"Just as technology can provide satisfaction with a job well done, it can also produce frustration, stress, and anger when it fails to function or is difficult for users to operate."
[Figure: Breakdown of user suggestions collected after using Tutortrac]
Tutortrac, a management application, is designed to provide "on-demand access to essential tools, such as appointment scheduling, logging visits, and activity reports" (Trac Systems). Unfortunately, the Ball State Learning Center's Tutortrac installation often does not perform to these standards. Meghan Clark, an undergraduate tutor, states, "Tutortrac often shuts down and doesn't let me view my tutoring times. It results in the client losing valuable tutoring time because by the time the desk figures out I had a new client, we've wasted 10 minutes of our 50-minute tutoring session." Elizabeth Fallon, the Tutoring Coordinator, also notes that the interface is not user-friendly and requires a significant amount of training for staff members to use. From the standpoint of a student visiting the center, Fallon states, "it is easy for a user to make a mistake that cannot be corrected."
To address these complaints, this user experience case study focuses on the current use of Tutortrac at the Ball State Learning Center. The specific goal of this research is to evaluate Tutortrac's scheduling function through user experience methods, building a clearer understanding of the problem areas and producing a report on the success of the system's design. Based on surveys, eye tracking, and heuristic evaluation, this study compiles a list of suggested changes to the design of Tutortrac. To test the validity of those suggested design changes and their impact on the overall user experience, the project compares low-fidelity prototypes through A/B testing. These findings can inform the next iteration of TracSystems management applications, which should be updated to better serve user communities like the Ball State Learning Center.
Access the full paper here.