A reporting tool that shows aggregated data from course and instructor evaluations.
The University of British Columbia (UBC) conducts online evaluations that allow 60k+ students to assess 5k+ instructors and 10k+ course sections yearly. Data collected during these evaluations inform decisions about whether instructors receive promotion or tenure.
MicroStrategy was employed to generate reports containing evaluation data. Producing these reports was time-consuming because each had to cater to an individual faculty's needs. Moreover, this report diversity made it difficult to compare instructors who taught course sections offered by different faculties.
SEoT Datamart, a solution based on Oracle Business Intelligence Enterprise Edition (OBIEE), replaced MicroStrategy as the reporting tool that enables deans and faculty, school, and department administrators to see aggregated evaluation data. It also reduced the number of distinct reports. Thanks to the new solution, reports containing instructor and course data can be made available shortly after the end of the evaluation period.
UBC IT developers and database analysts were responsible for implementing the new reporting tool, whereas the Centre for Teaching, Learning and Technology (CTLT) ensured faculties' needs were fulfilled by the new tool.
A CTLT colleague and I designed and conducted a usability test to verify whether the new reports would allow users to achieve their goals.
We created three documents: a list of tasks to be carried out by the usability test participants, a script followed by the UX researchers to ensure all participants received the same information, and a spreadsheet for recording the usability issues found. To improve these documents and estimate the duration of each usability session, we conducted a pilot study.
The usability test took place in a prepared room. As UBC staff use different operating systems (Windows, macOS, and Linux), participants were asked to bring the laptops they use at work. One researcher interacted with the participants, while the other recorded the usability issues found and the comments participants made.
Participants were given a sheet of paper containing the tasks to be performed. They were also asked to verbalize their thoughts while interacting with the system. This method, known as the think-aloud protocol, gives researchers insight into a participant's cognitive processes.
After the usability test sessions, we wrote a report describing the usability issues found and their severity levels (cosmetic, minor, major, or catastrophic). UBC IT used the report to allocate resources for improving the interactive reports.
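To illustrate how a severity-rated issue log can drive prioritization, here is a minimal Python sketch. The issue names and log structure are hypothetical (the source does not specify the spreadsheet format); only the four severity levels come from the report.

```python
from collections import Counter
from dataclasses import dataclass

# Severity levels from the usability report, least to most severe.
SEVERITY_ORDER = ["cosmetic", "minor", "major", "catastrophic"]

@dataclass
class UsabilityIssue:
    description: str
    severity: str  # one of SEVERITY_ORDER

def tally_by_severity(issues):
    """Count issues per severity level, so the worst can be fixed first."""
    counts = Counter(issue.severity for issue in issues)
    return {level: counts.get(level, 0) for level in SEVERITY_ORDER}

# Hypothetical example entries, not actual findings from the study:
issues = [
    UsabilityIssue("Filter label is truncated", "cosmetic"),
    UsabilityIssue("Export option is hard to find", "major"),
    UsabilityIssue("Report fails to load on Linux", "catastrophic"),
]
print(tally_by_severity(issues))
# → {'cosmetic': 1, 'minor': 0, 'major': 1, 'catastrophic': 1}
```

A tally like this makes it easy to see at a glance where development effort should go first.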