Product & UX Design · Case Study

UofT Resource Booking System Redesign

Transforming a cluttered, overwhelming interface into an efficient, intuitive, easy-to-use experience.

Role UX Research, A/B Testing, Prototyping, User Interviews, Workshop Facilitation
Context Course Project
Timeline 4 weeks · 2025

In this project, I...

  • Led a team of three to redesign a high-traffic university booking system used by thousands of students
  • Ran heuristic evaluation to identify key usability failures, including high cognitive load and data-reset bugs
  • Designed a requirements-driven recommendation flow to replace inefficient manual table scanning
  • Conducted A/B testing with 16 participants across 5 groups to validate design decisions quantitatively
  • Reduced average task completion time by 53.8 seconds, with 100% of testers preferring the redesigned experience

Overview

The UofT Resource Booking System is a critical tool for thousands of students, but its interface has grown outdated and difficult to use. This project applied Don Norman’s design principles and Nielsen’s usability heuristics to rebuild the system from the ground up.

Through heuristic analysis, I identified several usability barriers, most notably the heavy cognitive load of manual searching and selection-reset bugs. Leading a design team of three, I re-engineered the interface from a static grid into a requirements-driven recommendation system.

The Problems: High Cognitive Load & Bugs

Heuristic evaluations and user testing identified three critical pain points that made the original system "frustrating and inefficient":

01

Static Table Search

Availability was presented in a static table, forcing users to scan manually for available rooms.

02

Navigation & Data Reset Bug

Switching between campus buildings reset the date to the current day, forcing users to repeatedly re-enter search criteria.

03

Weak Signifiers

Primary buttons were too small to be easily noticed.

Legacy system UI audit

First Version of a Mid-Fidelity Prototype & A/B Testing

To address the identified pain points, I developed a series of mid-fidelity wireframes focused on automating the search process.

Mid-fidelity wireframe overview

With the mid-fidelity prototype ready, we conducted A/B testing involving 16 participants across 5 testing groups. The sessions focused on comparing the legacy interface with our redesigned flow to measure efficiency and clarity. Some valuable insights were found in the testing:

01

Preference-Led Discovery

A significant majority of testers (4/5 groups) found the filter-based search far more intuitive than manually scanning the original static table.

Filter preference pie chart
02

Phrasing Confusion

We introduced a new "Waitlist" feature to manage peak-hour demand; however, the specific button phrasing caused confusion for 3 out of 5 testing groups.

Waitlist notification button UI
03

Reduced Task Time

Quantitative data confirmed that the redesigned flow significantly reduced task completion time, saving an average of 53.8 seconds compared to the legacy system.

A/B test usage time comparison table

The Final Deliverable

Guided by A/B testing insights, I iterated on the design once more and delivered the definitive version of the UofT Resource Booking System. This version significantly elevates the user experience and offers high-value improvements in the following areas:

01

Modern Minimalist Aesthetics

Transformed the outdated, cluttered portal into a clean, contemporary interface. The redesign gives users more breathing room through generous white space.

Enhanced Visibility and breathing room
02

Requirements-Based Recommendations

Replaced manual table scanning with smart recommendations. Users can now input their needs, and the system instantly surfaces relevant options, providing an effortless discovery process.

Automated requirements-based search
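In essence, the recommendation flow inverts the old interaction: instead of users scanning a table, the system matches rooms against stated needs. The sketch below illustrates that matching logic; the data model, field names, and `recommendRooms` function are illustrative assumptions, not the project's actual code.

```typescript
// Hypothetical data model for illustration only.
interface Room {
  name: string;
  capacity: number;
  equipment: string[];
  freeSlots: string[]; // e.g. "Mon 10:00"
}

interface Requirements {
  minCapacity: number;
  equipment: string[];
  slot: string;
}

// Return rooms that satisfy every stated requirement, with the
// closest capacity match first so small groups aren't shown halls.
function recommendRooms(rooms: Room[], req: Requirements): Room[] {
  return rooms
    .filter(r =>
      r.capacity >= req.minCapacity &&
      req.equipment.every(e => r.equipment.includes(e)) &&
      r.freeSlots.includes(req.slot))
    .sort((a, b) => a.capacity - b.capacity);
}
```

The key design choice is that filtering and ranking replace visual scanning: the user states requirements once, and irrelevant rooms never reach the screen.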
03

Enhanced Data Persistence

Resolved the critical data-reset bug by implementing persistent filters. User filter criteria and search progress now persist across different screens, eliminating the need to re-enter data.

Persistent filters and seamless navigation
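The underlying fix is an architectural one: filter state lives in a single shared store that screens read from and patch, rather than in per-screen state that is rebuilt (and reset) on every navigation. A minimal sketch, assuming a simple in-memory store (the `FilterStore` class and its fields are hypothetical, not the project's implementation):

```typescript
// Hypothetical shared filter store: because screens merge partial
// updates into one persistent object, switching buildings no longer
// resets the chosen date.
interface BookingFilters {
  date: string;
  building: string;
}

class FilterStore {
  private filters: BookingFilters = { date: today(), building: "" };

  // Merge instead of replace: untouched fields (e.g. the date) persist.
  update(patch: Partial<BookingFilters>): void {
    this.filters = { ...this.filters, ...patch };
  }

  get current(): BookingFilters {
    return { ...this.filters };
  }
}

// Default the date filter to today in YYYY-MM-DD form.
function today(): string {
  return new Date().toISOString().slice(0, 10);
}
```

In a production web app the same idea would typically be backed by a state library or session storage, but the principle is identical: one source of truth for filters, patched rather than recreated.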
04

Optimized Wording

Optimized all button labels and interactive feedback to eliminate the phrasing confusion found during A/B testing.

Optimized microcopy and waitlist feature

Optimized Booking Workflow

The redesigned end-to-end workflow streamlines the user journey from initial search to final confirmation. User testing confirmed that this optimized flow saved users an average of 53.8 seconds compared to the original system.

Redesigned Booking Workflow

Enhanced Visibility & Signifiers

Drawing from Don Norman’s principles, the new design improves the visibility of system status and provides clear signifiers for user actions. The system ensures users always understand the current state of their requests.

Enhanced Visibility and Feedback

The results from our A/B tests were definitive: 100% of testers preferred the new design.

Interactive Prototype

Explore the redesign below. :)

Reflection

One of my most important takeaways from this project came from the continuous testing and iterative cycles, through which I learned to ground design decisions in empirical user data.

Another key takeaway from this experience was the value of uncovering unexpected insights through A/B testing and qualitative interviews. While my initial focus was on optimizing single-room reservations, user feedback revealed a critical need to support more complex, real-world scenarios—specifically, combining bookings across multiple rooms to cover a continuous time slot. This insight reshaped my understanding of efficiency: it is not solely about speed or simplicity, but about flexibility and adaptability in complex, real-life contexts. As a result, I am now more intentional in evaluating how designs perform in real-life scenarios and how well they support users navigating complexity.