ABOUT THE PROJECT
In the Fall of 2015, I worked in a group to conduct a user experience review of the Waterloo LEARN mobile website. This project was completed for academic purposes as part of DEI 615, a user experience course at the University of Waterloo.
Waterloo LEARN is a web-based learning management system used at the University of Waterloo. The system allows instructors to manage course materials, activities, and assessments and to interact with their students. For the purposes of our user experience review, we chose to focus on the mobile experience provided by the Waterloo LEARN platform.
GETTING STARTED
Our research process began with the hypothesis that the Waterloo LEARN mobile experience was not optimal in terms of usability, design, and range of available features compared with the desktop version. This hypothesis was based on our own experience of using Waterloo LEARN daily, and it served as the springboard for our research. Since our project team consisted of only five members, we were mindful that our personal impressions alone could be biased, so we needed to gather information from a wider range of users and data sources to support or challenge our assumptions.
EXPLORING WEB ANALYTICS
We explored online analytical data for Waterloo LEARN, which included information such as the number of sessions, bounce rate, average session duration, and page visits per session.
After analyzing and grouping the information, we noticed that the mobile platform ranked significantly lower than the desktop site. The number of sessions on the mobile platform was low: between March and October 2015, mobile sessions accounted for only 7.9% of total visits. According to our research, 58% of Canadians connected to the internet with a smartphone in 2014, so this trend was not reflected in the Waterloo LEARN metrics. We also discovered that users spent significantly less time on the mobile platform, and that its bounce rate was high compared with the average bounce rate of other websites. Of course, a low share of mobile visits, short visit durations, and a high bounce rate were not necessarily indicative of a problem, since the analytics data could be interpreted in a number of different ways. We needed further research to understand the root causes of the symptoms we were seeing.
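For readers curious how figures like these are derived, the sketch below shows one way to compute per-platform session share, average session duration, and bounce rate from a raw session export. This is an illustration only: the file name and column names are hypothetical, and our actual numbers came from the analytics platform's own reports.

```python
# Illustrative sketch only. Assumes a hypothetical CSV export with one row per
# session and columns "device_category" (desktop/mobile/tablet),
# "duration_seconds", and "bounced" (0 or 1).
import csv
from collections import defaultdict

sessions = defaultdict(list)  # device category -> list of session rows
with open("learn_sessions_mar_oct_2015.csv", newline="") as f:  # hypothetical file
    for row in csv.DictReader(f):
        sessions[row["device_category"]].append(row)

total = sum(len(rows) for rows in sessions.values())
for device, rows in sessions.items():
    share = 100 * len(rows) / total
    avg_duration = sum(float(r["duration_seconds"]) for r in rows) / len(rows)
    bounce_rate = 100 * sum(int(r["bounced"]) for r in rows) / len(rows)
    print(f"{device}: {share:.1f}% of sessions, "
          f"avg {avg_duration:.0f}s per session, {bounce_rate:.1f}% bounce rate")
```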
CONDUCTING A SURVEY
A survey was a good starting point for our user research. Aside from helping to gather useful data about target user demographics, as well as users’ needs and preferences, surveys help establish context around analytical data such as website metrics, and they make it easy to gather data quickly from a large set of individuals. Having reviewed the Waterloo LEARN user metrics, we wanted to find out how the mobile LEARN experience compared to the desktop experience. We hypothesized that students were often accessing LEARN on their mobile phones. What kind of user experience did the mobile LEARN website present?
When designing the survey, we followed best practices for creating effective survey questions: choosing only relevant questions, working hard to avoid personal bias, keeping the survey short and precise, using question formats that enable straightforward analysis, and leaving room for respondents to include information we had not anticipated.
We distributed the survey to approximately 250 University of Waterloo students. The 52 complete responses we received showed the following trends:
- The respondents were well distributed across all years of study, from first year students to graduate students. This was important because we wanted to gather responses from individuals whose experience levels with Waterloo LEARN varied. A first year undergraduate student might not be as comfortable with the Waterloo LEARN site as a fourth year student who has had time to get used to this particular learning management system over the years.
- The smartphones of choice were Android devices (53%), closely followed by iPhones (43%); 2% of respondents reported using a BlackBerry device. This information would be important in the product development phase of the project: if we were to prototype and develop a mobile Waterloo LEARN app, we would need to focus our efforts on the two most popular platforms.
- Most students preferred accessing Waterloo LEARN on their computers, but still reported accessing it on their smartphones 2-3 times a day.
- Many students reported issues with the mobile Waterloo LEARN experience, including being unable to access their grades and other important information.
- Respondents pointed out that the visual design of the mobile site was poor and did not reflect the look and feel of the desktop site.
- Several respondents indicated that they would prefer to have a Waterloo LEARN app instead.
FACILITATING USABILITY SESSIONS
Conducting usability sessions gave us the opportunity to test our assumptions and research findings. Using the findings from the survey, we created six scenarios for our session participants to complete in order to further explore the challenges and pain points we had discovered. We recruited a small but diverse set of participants for this part of the study to ensure that we covered a broad range of experiences and perspectives in our analysis.
FINDINGS
The findings from the analytics, survey, and usability testing sessions consistently showed that the mobile Waterloo LEARN website needed improvement. The analytics data suggested that the mobile platform was not performing optimally compared with other platforms, such as the desktop site. The survey confirmed our assumption that most students prefer accessing Waterloo LEARN on their computers; however, the same students reported accessing Waterloo LEARN on their smartphones 2-3 times a day. The observation sessions confirmed that the LEARN mobile site was not user-friendly for students, and gave us further insight into potential areas for improvement, including a more consistent visual design, improved accessibility, the addition of essential features (such as grade viewing), and a more intuitive layout of information that embraces the three-click rule (the idea that users should be able to reach any piece of information within three clicks or taps).