Automating Usability Testing for Prototypes of the Things in the Internet using Augmented and Virtual Reality

Patrick Harms, Jens Grabowski

Abstract

The Internet of Things (IoT) is challenging not only with respect to purely technical aspects. The users of the “things” in the internet also need to be able to configure IoT devices, to use them, and to understand and solve occurring problems. With the increasing complexity of devices due to their growing capabilities, this becomes a more and more challenging task. To ensure that devices in the Internet of Things are usable, their developers need to apply usability testing techniques during the development process. For this, potential users of a device are asked to use a prototype of it. This helps the developers get feedback on the device’s interaction design. This feedback can be gained by interviewing the users or asking them to fill in questionnaires. In addition, the detailed actions that users take on the prototype, which can be observed by the device’s development team, are a very good source for further analysis. During device development, usability testing should already be applied to early device prototypes. This allows finding usability problems as early as possible. Augmented and Virtual Reality (AR/VR) are possible vehicles for applying usability testing to purely virtual representations, i.e., virtual prototypes, of devices under development. For this, a virtual prototype of a device is presented to a user in a virtual world. The user can then describe how (s)he would interact with the device to execute a certain task. It is also possible to provide basic interaction capabilities so that users can actually try out their intended actions and, through this, get an improved impression of how the device shall work and be used. This also allows for simulating challenging usage situations, e.g., how the device reacts if a network connection is unavailable and whether or how the user can handle this. Using a Google Cardboard or similar equipment, AR/VR can nowadays be experienced by almost any owner of a smartphone.
Hence, usability evaluations of devices under development could be performed at a large scale by recruiting a large number of potential device users for an AR/VR-based usability evaluation. This can also be done in a remote setting, so that users do not have to come to a laboratory for usability engineering but can perform the test at home or wherever they want. However, the advantage of a larger number of test participants comes with the disadvantage that a lot of information about the usage of the device’s prototype is lost. For example, the usage can no longer be observed by simply standing next to the test participant. Instead, other observation approaches must be taken. One possibility is to log the usage of the virtual prototypes in the AR/VR at the level of individual user actions. This results in lists of user actions performed in a certain scenario. Because of the larger scale at which AR/VR-based usability testing can be done, it is also possible to trace a large number of users and their actions on the virtual prototypes. This in turn opens the field for applying automated usability testing, as is already possible for websites and other software as long as sufficient usage data is available. In the presentation, we will briefly describe the overall scenario for applying automated usability testing to virtual device prototypes using AR/VR. For this, we will sketch the basic process and mention important prerequisites, such as a minimum number of users. We will also show examples of product prototypes in AR/VR, how an interaction with these prototypes is possible, and how the respective user actions can be logged. Furthermore, we will outline how the tool AutoQUEST (Automated Quality Engineering of Event-driven Software) can be used and extended to perform automated usability testing based on the recorded data.
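The logging and analysis idea described above can be sketched in a few lines: user actions in the AR/VR are recorded as simple event records, and frequent action sequences are counted as a first hint at potential usability problems. The event fields and the pair-counting analysis below are illustrative assumptions for this sketch, not the actual AutoQUEST log format or its analysis pipeline.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical minimal event record for one user action on a virtual
# prototype; the field names are illustrative assumptions.
@dataclass(frozen=True)
class UserAction:
    user: str    # anonymous participant id
    action: str  # e.g. "press", "turn", "gaze"
    target: str  # element of the virtual prototype the action applies to

def action_sequences(log, length=2):
    """Count how often each consecutive action sequence occurs, per user."""
    counts = Counter()
    by_user = {}
    for ev in log:
        by_user.setdefault(ev.user, []).append((ev.action, ev.target))
    for seq in by_user.values():
        for i in range(len(seq) - length + 1):
            counts[tuple(seq[i:i + length])] += 1
    return counts

# Example: two participants each press the same button twice in a row,
# which may hint that the device does not react as users expect.
log = [
    UserAction("u1", "press", "power_button"),
    UserAction("u1", "press", "power_button"),
    UserAction("u2", "press", "power_button"),
    UserAction("u2", "press", "power_button"),
]
common = action_sequences(log).most_common(1)
```

With enough participants, such frequency counts over logged action sequences are one way an automated analysis can surface candidate usability problems without an observer being present.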
Document Type: 
Presentations
Language: 
English
Howpublished: 
presented at User Conference on Advanced Automated Testing (UCAAT) 2017
Organization: 
ETSI
Month: 
10
Year: 
2017