MathWorks

UX Research • Accessible Design • Sponsored Client Project

How might we design a digital solution to enable lab instructors to create MATLAB assignments more quickly and easily?

The Challenge

Overview

MATLAB is a programming platform developed by MathWorks that enables lab instructors and students to solve scientific problems. For this client project, we were tasked with developing a platform to assist lab instructors with the difficult processes of creating and grading MATLAB assignments.

During this project, I led the user research and strategy for a digital repository assignment marketplace with an in-app points system that allows lab instructors to publish, browse, and purchase lab assignments to aid assignment creation. We ensured that our solution adhered to the Web Content Accessibility Guidelines (WCAG) Level AAA. Ultimately, our solution improves lab instructors’ teaching experience, which in turn improves the learning experience for students.

My Role
Lead UX Researcher
UX Designer

The Team
2 designers, 2 researchers
Guided by 1 senior designer and 1 senior researcher

Timeline
August - December 2023

Skills
Accessible Design
User Research
Usability Testing
Ideation and Storyboarding
Wireframing and Prototyping

Tools
Figma
FigJam
Miro
Notion
Procreate
Qualtrics

Main features of our final design

Publishing assignments to the marketplace

Users can fill in relevant details and publish their own assignments to the marketplace. In return, users earn in-app points, which they can later use to purchase other users’ assignments.
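
To make the mechanics concrete, here is a minimal sketch of how such a points ledger could work. The names (PointsLedger, earnOnPublish, spendOnPurchase) and point values are illustrative assumptions, not part of our actual design:

```typescript
// Illustrative sketch of the in-app points ledger; names and point
// values are hypothetical.
type Transaction = {
  kind: "earn" | "spend";
  assignmentId: string;
  points: number;
};

class PointsLedger {
  private transactions: Transaction[] = [];

  // Publishing an assignment credits the author with points.
  earnOnPublish(assignmentId: string, points: number): void {
    this.transactions.push({ kind: "earn", assignmentId, points });
  }

  // Purchasing debits points, but only if the balance covers the cost.
  spendOnPurchase(assignmentId: string, cost: number): boolean {
    if (this.balance() < cost) return false;
    this.transactions.push({ kind: "spend", assignmentId, points: cost });
    return true;
  }

  // The point balance shown in the UI is derived from the ledger.
  balance(): number {
    return this.transactions.reduce(
      (sum, t) => sum + (t.kind === "earn" ? t.points : -t.points),
      0
    );
  }
}
```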

Onboarding flow for the in-app points system

When users first enter the marketplace, they encounter this onboarding flow, which helps them learn about the in-app points system, including how to spend their points and where to find their point balance.

Purchasing assignments from the marketplace

Users can browse the marketplace to purchase other users’ assignments. Before purchasing, they can review assignment ratings and student grade statistics, and preview the entire assignment in JPEG format.

How did we approach the problem space?

To narrow the problem space, we started with 5 semi-structured user interviews with lab instructors at Georgia Tech. We chose to start with this research method because it would give us a deep understanding of both the assignment creation and grading processes and help us identify common pain points in each. We also interviewed 4 teaching assistants and 2 graduate students to develop a holistic understanding of the problem space.

To better understand user context, we embedded contextual inquiries into our user interviews, observing lab instructors in their natural setting during the assignment creation process.

We then utilized a hierarchical task analysis to simplify and break down the complex tasks of assignment creation and grading. By doing so, we were able to evaluate each individual step taken and identify opportunities for improvement in both processes.

To supplement our primary research, we conducted a comparative analysis of indirect competitors (MATLAB Grader, Coassemble, Google Colab, and zyBooks). By examining these competitors’ strengths and weaknesses, we identified gaps in the market and gained inspiration for our ideation process.

Focusing in on assignment creation

Results from the semi-structured interviews and contextual inquiries were analyzed with an affinity map. This helped us to familiarize ourselves with the data and to identify the major themes in both processes.

Based on our research findings, we identified assignment creation as a higher priority than assignment grading: auto-grading solutions such as Gradescope already exist to help lab instructors, and the primary problem there was not a lack of solutions but a lack of awareness that they exist.

Assignment creation is hard!

Through our research, we discovered that creating assignments is a time-consuming and difficult process, resulting in much frustration for lab instructors.

We knew this was an important problem to solve, so we distilled our research into the three top insights and their design implications, shown below.

Summarizing the problem

Creating MATLAB assignments is tough. It is hard to generate new assignment problems that challenge students and test concepts accurately while accommodating varying levels of academic ability. Furthermore, lab instructors pull from multiple sources to find relevant components, which is inefficient and time-consuming.

Putting lab instructors first

To remind ourselves of our target users, we created 2 personas with associated user stories and user journey maps.

This helped us to humanize our target users and deepen our understanding of their pain points.

Our top three ideas

Bearing in mind our design implications and target users, we began brainstorming using the SCAMPER technique. We then voted for our top 3 ideas, which we fleshed out through storyboarding and sketching.

Our ideas were (1) a forum with an in-app messaging feature, (2) a digital repository assignment marketplace with an in-app currency system, and (3) a modular assignment creation platform.

Based on these ideas, we gathered early feedback from MathWorks and an HCI expert. Our overall goal for this feedback was to evaluate how well our ideas addressed user pain points and to identify critical issues. More specifically, we wanted to identify the best features across our ideas, as this would enable us to narrow the scope of our solution before developing low-fidelity prototypes.

Rethinking the in-app currency system

Of the three ideas, the Digital Repository Assignment Marketplace stood out as the most direct way to address the pain point of brainstorming and generating new assignments. By solving this problem, we address the root difficulty in assignment creation, and we can later build on it to tackle secondary issues such as streamlining the assignment creation process. However, MathWorks voiced concern about the in-app currency system, which we approached as shown below.

User flows, wireframes and initial user testing

Based on the feedback and our design decisions so far, we created two main user flows: (1) publishing and categorizing one’s own assignments, and (2) purchasing and downloading other users’ assignments. From these, we created wireframes and a low-fidelity prototype, which we tested through task-based scenarios with 2 lab instructors.

The goal of this initial round of user testing was to evaluate how well participants understood the user flows and points system, as well as to identify pain points, missing information, and any accessibility issues. We also wanted to observe participants’ reactions in real time as they navigated the product, so we held the user tests in person.

From low-fidelity to high-fidelity prototype

Based on user testing of our low-fidelity prototype, we uncovered these main insights and made changes accordingly.

Finding: The in-app points system is confusing

Although both participants said they liked the concept and felt it was a good incentive to contribute, both expressed initial confusion, as they did not understand how to earn, spend, and check their points.

P2: “I didn’t even notice the reward system”

Solution: An onboarding flow

To help users understand the in-app points system, we designed an onboarding flow that explains how to earn points, what they are used for, and where to find the current point balance.

Finding: The assignment preview does not have enough information

Both participants felt that the assignment preview was insufficient for deciding whether to purchase an assignment. They wanted more details to assess assignment quality, such as a rating or review system.

P1: “You have to make sure you will get what you want to get before spending the points”

Solution: Adding assignment ratings, student grade statistics and a full preview

We added a rating system and student grade statistics so users can make a more informed choice before purchasing assignments. Additionally, users can view the full assignment as a watermarked JPEG, so they can inspect it but not copy or edit it. Only upon purchase do users receive a PDF copy, which they can later edit to create their own assignments.
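
As a rough sketch of this access model (the types and function names are hypothetical; we delivered designs, not an implementation), the rule could look like this:

```typescript
// Hypothetical sketch of the preview-versus-purchase rule: everyone
// gets a watermarked JPEG preview; only buyers get the editable PDF.
type Format = "jpeg-watermarked" | "pdf-editable";

interface Assignment {
  id: string;
  previewUrl: string; // watermarked JPEG, view-only
  pdfUrl: string; // editable copy, gated behind purchase
}

function assetFor(
  assignment: Assignment,
  purchasedIds: Set<string>
): { format: Format; url: string } {
  // Buyers receive the editable PDF they can adapt for their own labs.
  if (purchasedIds.has(assignment.id)) {
    return { format: "pdf-editable", url: assignment.pdfUrl };
  }
  // Everyone else can inspect the full assignment but not copy or edit it.
  return { format: "jpeg-watermarked", url: assignment.previewUrl };
}
```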

Finding: Accessibility issues

Participant 2 struggled to find the cost of the assignment because the button and text were too small and the color contrast was too low.

P2: “The font is too small, too light”

Solution: Increasing visibility, following WCAG

To increase the visual emphasis of the assignment’s cost, we added a blue border and an icon to the points. We also ensured that the colors used satisfied WCAG Level AAA contrast requirements.
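
The check itself is mechanical: WCAG 2 defines contrast as a ratio of relative luminances, and Level AAA requires at least 7:1 for normal-size text. Below is a minimal sketch using the standard formulas (illustrative code, not project tooling; the sample colors are made up):

```typescript
// Standard WCAG 2 relative-luminance and contrast-ratio formulas.
function relativeLuminance([r, g, b]: [number, number, number]): number {
  const [lr, lg, lb] = [r, g, b].map((channel) => {
    const c = channel / 255; // normalize 8-bit sRGB to [0, 1]
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * lr + 0.7152 * lg + 0.0722 * lb;
}

function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort(
    (a, b) => b - a
  );
  return (hi + 0.05) / (lo + 0.05);
}

// Level AAA requires a ratio of at least 7:1 for normal-size text.
const passesAAA = contrastRatio([0, 82, 204], [255, 255, 255]) >= 7;
console.log(passesAAA);
```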

Testing our final design

First Phase

Expert testing

Moderated sessions with 3 HCI experts
- 2 cognitive walkthroughs
- 1 heuristic evaluation

We began our evaluation with expert testing to first identify major usability or learnability issues that needed to be addressed prior to user testing. The goal of the cognitive walkthroughs was to evaluate the learnability of the platform, especially as the points system may be novel and confusing to new users. We also used the heuristic evaluation method to identify any major usability issues.

Second Phase

User testing

Moderated sessions with 4 professors
- task-based scenarios: publishing and purchasing assignments
- pre- and post-session interviews

We then moved on to user testing, which aimed to gather quantitative and qualitative feedback on the usability of our two user flows. We also wanted to evaluate the overall functionality of the platform and assess how well it addressed the core user pain point: the difficulty of brainstorming assignments.

Results of user testing

Success metrics

All participants were able to complete both tasks successfully without help from the moderator.

On average, Task 1 was more difficult than Task 2.

Qualitative feedback

Overall feedback

The Good: Participants had a good understanding of the functionality of the platform and liked the idea and concept.

The Bad: Participants had security concerns about students gaining access to the platform and cheating. Participants also wanted to pay with cash instead of points.

Task 1: Publishing assignments

The Good: Participants stated that it was easy to use and intuitive. They liked the in-app points system.

The Bad: Participants wanted more guidance on how to fill in assignment details for categorization. Additionally, they wanted a ‘quick publish’ option so that they did not need to go through the entire publishing process.

Task 2: Purchasing assignments

The Good: Participants liked the format of the assignment preview and felt that the amount of information was sufficient to make a decision regarding purchase.

The Bad: Participants felt the wording of ‘Marketplace’ was too transactional and not in keeping with the academic theme. Participants also wanted a cleaner interface to reduce cognitive load.

Final thoughts and reflections

During this project, I was reminded of the importance of human-centered design. As we went through the design process, we realized that we had deviated from the initial client directive, which was, more specifically, to develop a graphical user interface (GUI) for MATLAB assignments. However, our redesigned solution still tackled the core issue of aiding lab instructors in assignment creation, and it addressed the main user pain points more directly.

This was validated by the fact that one lab instructor even asked us to keep working on the project and develop it into a real product! In the future, I would love the opportunity to continue this work and iterate based on the user feedback above.

Lastly, I am incredibly thankful to my teammates and to MathWorks for all the help and guidance along the way!