Augmented Reality Exhibit Usability Test

In July 2018, we conducted a usability test on ARt: Exhibition in 3D, an augmented reality (AR) art exhibit in the Technology Showcase in Hunt Library.

Overview

In July 2018, 11 participants took part in a usability test of ARt: Exhibition in 3D, an augmented reality (AR) art exhibit in the Technology Showcase in Hunt Library. The study focused on how participants interacted with an installed handheld mobile device and the Snapchat app for viewing the digital models. A successful interaction required a person to pick up a device, open Snapchat, point the device at a colorful target (or snapcode), hold their thumb down on the screen to scan the snapcode, and select the correct button for the model to pop up on the screen.

Research Questions

  1. Do the phones work well?

  2. Are the participants successful at scanning the snapcodes?

  3. Do the participants know which buttons to press in the Snapchat app?

  4. Are participants successful at getting to see the digital models?

  5. Do participants look at the written directions?

Recommendations and Changes

Overall, participants responded positively to the exhibit, but most encountered problems interacting with it on the first try. An ideal change would be to place a screen next to the exhibit playing a video that demonstrates how to scan and view models using the Snapchat app. This would be difficult to implement, however, as it would require installing a device at the exhibit capable of playing a video on loop during library hours. An alternative approach would be to update the visual instruction card to show how to view the models without having to select from the button choices.

Having future exhibits based on Snapchat may be ill-advised due to usability issues with the Snapchat app. Even users highly familiar with Snapchat encountered problems getting the models to load into the view.

How We Did It

We conducted usability tests with 11 participants. Each test took less than five minutes, and we incentivized participants using candy bars. One staff member recruited passersby near the Technology Showcase at Hunt Library. A second staff member facilitated, reading the usability test script to participants. A third recorded notes using an online data entry form.

What Worked Well

  • Mobile device position. Participants did not seem to have difficulty physically holding the mobile device and pointing it at the snapcodes.

  • Button choice for unlocking the models (under certain conditions). After scanning a snapcode, most participants pressed the intended button ("Take a Snap"). This was true for participants encountering a locked model, and less so for those who saw already-unlocked models.

  • Interacting with the model once scanned. After the model successfully popped up, six participants intuitively tried interacting with it using hand gestures and moving the phone around. These participants tended to be familiar with Snapchat.

Issues

  • Knowing what to click on. Participants who had never used Snapchat tended to open the app and wait for instructions rather than trying out interactions. The process of scanning a snapcode was not intuitive.

  • Failure to properly scan the snapcodes. Nine of the 11 participants failed to scan a snapcode on their first try, and one participant was never able to load a model. The most common issue, seen in six participants, was placing the thumb in the wrong location on the screen (over the button instead of the Snapchat icon); as a result, five took a video and one took a static image instead of scanning the code.

  • Zooming in too far while scanning the snapcodes. At least two participants failed to scan a snapcode because the phone was held too close to the code, leaving parts of it out of view.

  • Not understanding button choices after scanning the model. After scanning models that had already been unlocked, multiple participants were confused by the button choices ("Take a Snap" or "Send to a Friend"). The correct choice was "Take a Snap," which popped the model into view.

  • Finding the model after scanning it. Once the model popped into view, at least two participants did not see it and re-scanned the snapcode.

  • Reading the instructions. None of the nine participants who failed to scan a snapcode consulted the instruction card placed in the exhibit; instead, they looked to the person guiding them through the usability study for help. This may have been a result of how the usability test was conducted.

  • Interacting with the models. Two participants had to be prompted with a question, “Is there anything you can do with it?” before interacting with the models.

Suggestions from Participants

  • Make the model interactions more interesting. Participants felt the models should have fun interactions, such as flying or changing color, as is common with professionally designed models in Snapchat.

  • Place a visual instruction sheet beside each model, rather than having instruction sheets in only two locations in the exhibit.

  • Have the model interactions be more consistent from model to model. Some models allowed clicking and dragging; others did not.

Team