1. OVERVIEW
  2. Problem Definition
  3. REDESIGN
  4. Heuristic Evaluation
  5. Prioritization
  6. User Research
  7. DESIGN DECISIONS
  8. My Contribution
  9. RESULTS

APPLYGRAD ADMIN VIEW

UX/UI DESIGN | HEURISTIC EVALUATION | PROJECT MANAGEMENT
A user-centered redesign for the administrator view of Carnegie Mellon School of Computer Science's graduate application website.
Team: Tommy Byler, Bidisha Roy, Rituparna Roy, and Amrita Sakhrani
ApplyGrad Team: Dale Shanefelt (Engineer), Haylie Toth (Front-end developer)
April 2020 — May 2020

MY ROLE

  • Product Designer
  • Project Manager
  • Client Communication

CONTEXT

A six-week UX consultation with the School of Computer Science at CMU to redesign its graduate application system, ApplyGrad, conducted remotely during the COVID-19 pandemic.

METHODS

  • Heuristic Evaluation
  • Card Sorting
  • Think Aloud Protocol
  • Feature Prioritization
  • Low-fidelity Prototyping
  • High-fidelity Prototyping

Our team of five master’s students was tasked with rapidly identifying high-impact leverage points in the ApplyGrad admin view user experience. Based on contextual inquiry, heuristic evaluation, and user testing, we redesigned the core landing page experience and handed off a high-fidelity prototype and design specifications for implementation.

Problem Definition

ApplyGrad, the web portal that the CMU School of Computer Science uses to process all of its graduate and PhD applications, was outdated, leaving many opportunities for an improved user experience.
Experienced administrators had developed their own workarounds, but over each yearly nine-month application cycle the platform’s limitations added up to sizable frustration and inefficiency, forcing administrators to rely on external tools, train others in these workarounds, and track data manually.

REDESIGN

With our redesign, we focused on revamping the landing page experience.
Our heuristic evaluation and user interviews revealed clear opportunities to reduce the clutter of the home screen and provide important metrics at a glance.
We restructured the global navigation, adding a new 'Home' dashboard page with important stats and quick links, and divided the remaining sections into their own landing pages.
> SCREENS
> INTERACTIVE PROTOTYPE

Heuristic Evaluation

Our team conducted individual heuristic evaluations of the system using Nielsen and Molich’s 10 usability heuristics.
After reviewing our findings and discussing the severity of the violated heuristics, we generated 32 opportunities for improvement, along with design recommendations. The most commonly violated heuristics were 'aesthetic and minimalist design' and 'flexibility and efficiency of use'.
Clustering our recommendations into groups revealed our key findings:
  • The system has inconsistent, unclear visual styles, naming conventions, and hierarchies
  • Users must switch between multiple roles to accomplish different tasks
  • The interface does not support progress tracking
  • The interface does not support certain bulk actions
  • Organization of lists is unintuitive
  • Lack of feedback from user actions requires additional manual cross-verification 
  • The current workflow does not match the user’s mental model of the admission process
Many of the recommendations stemming from these findings aimed to directly reduce cognitive load and improve efficiency of use.

Prioritization

We used an impact-effort matrix to plot the value of each design recommendation against its development effort, working collaboratively with our administrator and developer partners to validate our assumptions.
From this prioritization exercise, we aligned on a narrowed scope where we could provide the most value within our project’s short timeframe: addressing information hierarchy, information overload, and progress tracking through a redesigned landing page experience.
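For illustration, here is a minimal sketch of how recommendations can be bucketed into impact-effort quadrants; the recommendation names and scores are hypothetical stand-ins, not our actual ratings, which were set collaboratively with the ApplyGrad administrators and developers.

```typescript
// Hypothetical sketch: bucketing design recommendations into an impact-effort
// matrix. Names and scores below are illustrative only.

interface Recommendation {
  name: string;
  impact: number; // estimated user value, 1 (low) to 5 (high)
  effort: number; // estimated development effort, 1 (low) to 5 (high)
}

type Quadrant = "quick win" | "major project" | "fill-in" | "deprioritize";

function classify(rec: Recommendation): Quadrant {
  const highImpact = rec.impact >= 3;
  const highEffort = rec.effort >= 3;
  if (highImpact && !highEffort) return "quick win";
  if (highImpact && highEffort) return "major project";
  if (!highImpact && !highEffort) return "fill-in";
  return "deprioritize";
}

const recommendations: Recommendation[] = [
  { name: "Redesigned landing page hierarchy", impact: 5, effort: 3 },
  { name: "At-a-glance progress tracking", impact: 4, effort: 2 },
  { name: "Bulk actions on applicant lists", impact: 4, effort: 5 },
];

for (const rec of recommendations) {
  console.log(`${rec.name}: ${classify(rec)}`);
}
```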

User Research

Through a variety of research methods, we quickly generated ideas for UX improvements and elicited valuable qualitative feedback.
> CONTEXTUAL INQUIRY
We set up a remote contextual inquiry session with an administrator super-user, where we saw and heard first-hand how the administrator workflow adapts to the limitations of the current ApplyGrad platform. The session also confirmed that an administrator’s work is divided into three distinct, sequential phases:
  • Application, from September to January. In ApplyGrad, administrators monitor incoming application documents, contacting applicants to answer questions and ensure they are on track to meet deadlines and requirements.
  • Evaluation, from January to March. Administrators use ApplyGrad to evaluate applicants in multiple rounds, assigning faculty reviewers to each applicant and tracking the reviewers’ progress, culminating in a final list of admitted, waitlisted, and rejected students.
  • Admission, from March to May. In the final stage, administrators use ApplyGrad to monitor student responses as admissions are finalized.
> CARD SORTING
Using Zoom and Mural, we conducted a virtual card sorting activity with a super-user administrator who had used the system for 15 years. Through two sorting tasks, we aimed to understand the user’s mental model for organizing the various tasks and touchpoints in the admissions process, and to reveal which home screen actions were must-haves, should-haves, or could-haves.
> CONCEPT TESTING
Armed with the knowledge from our heuristic evaluation and contextual inquiries, we created a series of 12 designs and conducted A/B testing to identify which aspects and features resonated most with users, and why.

"A lot of our work is data-driven."

Administrator at the School of Computer Science
Overall, the tests validated the modern visual approach and navigation restructuring that would serve as a basis for our final iteration. The testing also revealed what wasn’t working in our designs, such as the timeline visualization on the dashboard, and showed that the calendar, recent activity, and favoriting features were less necessary.

DESIGN DECISIONS

Dividing the screens into their own landing pages, which was core to our redesign, was driven by two key findings.
1. Reducing clutter on the home screen
Both user interviews and our heuristic evaluation pointed to the home screen feeling unmanageable. Calls to action are small, close together, and plentiful, mixing in information from previous academic years and showing the user every option for every section on a single screen. The sheer amount of clickable content in such close proximity meant users faced an unreasonable cognitive load just to complete simple navigation. Our redesign addresses this by dividing the actionable links into separate views and ensuring links have adequate spacing and hierarchy.
2. Emulating users' mental models of the admissions process through a sequential workflow
Through our contextual inquiries and user interviews, we learned that the admissions process is highly sequential. While it’s important to always have access to all parts of the system, users spend the majority of their time in one of the three sections at any given point in the year.
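As a loose illustration of the resulting navigation structure, here is a minimal sketch assuming a simple route-per-section layout; the paths and descriptions are hypothetical, not the production ApplyGrad routes.

```typescript
// Hypothetical sketch of the restructured global navigation: a Home dashboard
// plus a dedicated landing page per admissions phase. Paths are illustrative.

interface NavSection {
  label: string;
  path: string;
  description: string;
}

const globalNav: NavSection[] = [
  { label: "Home", path: "/", description: "Dashboard with key stats and quick links" },
  { label: "Application", path: "/application", description: "Monitor incoming application documents" },
  { label: "Evaluation", path: "/evaluation", description: "Assign reviewers and track review progress" },
  { label: "Admission", path: "/admission", description: "Monitor responses from admitted students" },
];

for (const section of globalNav) {
  console.log(`${section.label} (${section.path}): ${section.description}`);
}
```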
Two further design decisions shaped the new dashboard page.
3. Providing quick visibility into each part of the system, grounding the user through data lists and visualizations
This was a feature users did not know they needed, but one that every user responded to positively once it was presented. In the existing system, these data points can be found, but only after digging through several internal pages to locate the numbers. For the key data points on the dashboard, some users preferred visualizations while others preferred a more straightforward list; the final dashboard design accommodates both by ensuring each chart’s legend can serve as a clear, scannable list.
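A minimal sketch of that idea, with hypothetical labels and counts rather than real ApplyGrad data: a single labeled, counted data shape drives both the chart and its legend, so the legend reads as a plain list even without the visualization.

```typescript
// Hypothetical sketch: one data shape drives both a dashboard chart and its
// legend, so the legend doubles as a scannable list of the key numbers.
// Segment labels and counts are illustrative, not real ApplyGrad data.

interface ChartSegment {
  label: string;
  count: number;
  color: string; // consumed by the chart renderer
}

const applicationStatus: ChartSegment[] = [
  { label: "Submitted", count: 1240, color: "#4a90d9" },
  { label: "In progress", count: 310, color: "#f5a623" },
  { label: "Not started", count: 85, color: "#d0021b" },
];

// The legend rendered as plain text: readable as a list even without the chart.
function renderLegend(segments: ChartSegment[]): string {
  const total = segments.reduce((sum, s) => sum + s.count, 0);
  return segments
    .map((s) => `${s.label}: ${s.count} (${Math.round((s.count / total) * 100)}%)`)
    .join("\n");
}

console.log(renderLegend(applicationStatus));
// Submitted: 1240 (76%)
// In progress: 310 (19%)
// Not started: 85 (5%)
```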
4. Integrating a calendar, not for upcoming milestones but for comparing current data with past dates
In testing, full calendar functionality proved less necessary, but users responded well to being able to retrieve data from past dates. We confirmed with the development team that this was possible given how the data is stored, making it a powerful way to put currently underutilized, useful data to work. We also made it clear when users were viewing a past date, changing the background color and adding a clear CTA front and center to return to the present date.
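A minimal sketch of that interaction state, assuming a simple date-driven view model; the names and shape are illustrative rather than the production implementation.

```typescript
// Hypothetical sketch of the dashboard's date state: viewing a past date
// switches the visual treatment and surfaces a "back to today" action.

interface DashboardViewState {
  selectedDate: Date;
  isPastView: boolean;        // drives the alternate background color
  showReturnToToday: boolean; // drives the front-and-center CTA
}

// Normalize to local midnight so only the calendar date is compared.
function startOfDay(d: Date): Date {
  return new Date(d.getFullYear(), d.getMonth(), d.getDate());
}

function viewStateFor(selectedDate: Date, today: Date = new Date()): DashboardViewState {
  const isPastView = startOfDay(selectedDate).getTime() < startOfDay(today).getTime();
  return { selectedDate, isPastView, showReturnToToday: isPastView };
}

// Selecting a past date flips the dashboard into "historical" mode...
const pastView = viewStateFor(new Date(2020, 0, 15));
console.log(pastView.isPastView, pastView.showReturnToToday); // true true

// ...and the CTA simply resets the selection to the present date.
const backToToday = viewStateFor(new Date());
console.log(backToToday.isPastView); // false
```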

My Contribution

All five of us contributed substantially to the rapid user research and iterative design for this project. Individually, I also took on the project manager role, managing our partnership with ApplyGrad's developers, keeping us on track with our timeline and deliverables, and owning the design and functionality of the new home screen.
Leading communication with the development team involved weekly touch-points, assessing the feasibility of our designs, and aligning on the handoff package needed for implementation.
After we split the three main sections (Application, Evaluation, and Admission) into their own pages, we had to rethink the landing page experience. I led the creation of this new page: a dashboard giving users quick visibility into the system’s data for the current date or for a past date of their choosing.

RESULTS

> HANDOFF PACKAGE
Our close partnership with the development team helped ensure that the design package we delivered fully prepared them to integrate our redesign.
The handoff package included the final interactive prototype, a style guide, high-fidelity screens with code specifications, and an executive summary containing the design rationale and functional annotations that describe exactly how the system should work, along with recommended code libraries for implementation.
> PRESENTING TO STAKEHOLDERS
We pitched our design, via Zoom, to two key groups of stakeholders: first, the ApplyGrad steering committee, and then the Master’s Heads committee.

“To be able to see at a glance how many applicants have actually made decisions, that’s amazing. That’s very exciting.”

Administrator at the School of Computer Science & ApplyGrad user
From the first meeting, we heard from excited administrator users who were eager to have the redesign implemented. In the Master’s Heads group presentation, we received buy-in from the decision-makers once it was clear that our design package adequately prepared the development team for implementation.

“A really terrific improvement.”

Head of SCS Master’s Programs
Pending funding approval, the School of Computer Science will be moving forward with our redesign as a basis for refreshing the ApplyGrad admin view.