Course Catalog Redesign

  • +20% task completion rate

  • −48% “back‑to‑top” scrolls

  • −37% time from first search to CRN copy

  • −35% unnecessary detail‑page clicks

Project Overview

Timeline: Oct – Dec 2023

Status: Pitched to School Admin

My Role: UX Designer

Team: Kaijie Huang, Himadri

TL;DR

We redesigned The New School’s course catalog to make searching, comparing, and planning classes effortless, without requiring sign‑in or access to protected data. Students can filter faster, compare sections side‑by‑side, build a wishlist that persists across sessions, check schedule conflicts, and copy CRNs in one click.

  • Key changes: sticky/collapsible filters; two‑pane layout (results + details); merged single‑section cards; section comparison; cookie‑based wishlist; schedule/conflict view; one‑click CRN copy.

  • Impact (early test signals): Time from first search to CRN copy ↓ 37%; “back‑to‑top” scrolls ↓ 48%; unnecessary detail‑page clicks ↓ 35%; task completion rate ↑ 20%.

Context & Problem

Context: Students struggled with the existing catalog’s non‑sticky filters, hard‑to‑scan results, and fragmented flows. At the same time, FERPA constraints prevented deeper integration with registration data, so we needed a privacy‑safe planning experience that still felt seamless.

Why this matters: Course discovery drives academic momentum. If students can’t quickly find “a class that fits,” they defer, drop, or over‑enroll.

Constraints:

  • FERPA: No direct access to personally identifiable enrollment data; solution must work logged‑out and respect privacy.

  • Fragmented systems: Catalog and registration are separate; students bounce between pages to compare and plan.

  • Scale: ~1,700 courses affecting ~10,800 students each term.

Goals & Success Criteria

  1. Find the right class faster. Reduce the time and clicks to a viable schedule candidate.

  2. Lower cognitive load between pages. Keep orientation and context as students move from browse → compare → plan.

  3. Make planning persistent without sign‑in. Preserve choices across sessions safely.

Primary KPIs (reporting after tests):

  • Task completion rate: ↑ 20%

  • Mean time on task: ↓ 37%

  • Page toggles: ↓ 48%

Research Method

  1. Heuristic review (Nielsen + higher‑ed patterns) and task walk‑throughs of the current catalog.

  2. Rapid contextual interviews with students at advising centers (10–15 min each, n≈8) about how they currently pick classes.

  3. Intercept survey on the catalog page (n≈80 responses) about biggest blockers (filters, section comparison, schedule fit).

Insights

  1. Non‑sticky, expansive filters forced repeated scrolls to the top to refine queries.

  2. Results lacked section‑level comparability (meeting times, instructors, seats), causing avoidable detail‑page clicks.

  3. Information hierarchy wasn’t scannable: titles dominated, while actionable cues (days/time/conflicts) were buried.

Key Design Decisions

1) Three‑Column Layout with Foldable Side Panels

[Problem] Students lost context when switching between filtering, scanning results, and opening details; the screen felt cramped or required constant back‑and‑forth.

[Decision] Adopt a three‑column layout: left Filter panel, center Course Cards, right Course Detail. Both the Filter and Detail panels are foldable to maximize space when scanning or comparing.

[Why] Keep the whole decision surface visible at once—filtering, scanning, and detail checking happen side‑by‑side, enabling at‑a‑glance comprehension and fewer context switches.

[Impact] Back‑to‑top scrolls ↓ 80%; scan‑to‑decision time ↓ 40%.
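
To make the folding behavior concrete, here is a minimal sketch of the three‑column shell, assuming a React + TypeScript stack; the component and panel names are hypothetical, not production code:

```tsx
import { useState } from "react";

// Placeholder panels; their real content is out of scope for this sketch.
const FilterPanel = () => <div>filters…</div>;
const CourseDetail = () => <div>detail…</div>;

// Three-column shell: filters (left), course cards (center), detail (right).
// Either side panel folds to a slim rail so the center column reclaims
// the width while the student scans or compares.
export function CatalogShell() {
  const [filtersOpen, setFiltersOpen] = useState(true);
  const [detailOpen, setDetailOpen] = useState(true);
  const cols = `${filtersOpen ? "280px" : "48px"} 1fr ${detailOpen ? "360px" : "48px"}`;

  return (
    <div style={{ display: "grid", gridTemplateColumns: cols }}>
      <aside>
        <button onClick={() => setFiltersOpen(o => !o)}>
          {filtersOpen ? "Hide filters" : "Filters"}
        </button>
        {filtersOpen && <FilterPanel />}
      </aside>
      <main>{/* course cards render here */}</main>
      <aside>
        <button onClick={() => setDetailOpen(o => !o)}>
          {detailOpen ? "Hide detail" : "Detail"}
        </button>
        {detailOpen && <CourseDetail />}
      </aside>
    </div>
  );
}
```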

2) Filter IA (Priority‑First) inside the Left Panel

[Problem] Low‑value filters crowded high‑frequency ones; students over‑filtered or missed key toggles.

[Decision] Re‑group filters within the left panel: top shows primary narrowing (dept, days, time windows), middle surfaces section attributes (instructor, location, availability), bottom nests advanced/rare options under collapsible groups.

[Why] Enable at‑a‑glance scanning of results while refining progressively; reduce clutter while keeping power controls reachable.

[Impact] Filter interactions per task ↓ 60%; valid‑result rate on first try ↑ 30%.
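
As an illustration, the regrouped filter IA can be expressed as ordered data; the group and facet names below are assumptions made for the sketch:

```ts
// Order encodes priority top-to-bottom; "collapsed" hides rare options
// behind a disclosure until the student expands them.
type FilterGroup = {
  label: string;
  tier: "primary" | "section" | "advanced";
  collapsed: boolean;
  facets: string[];
};

const filterIA: FilterGroup[] = [
  { label: "Narrow down",     tier: "primary",  collapsed: false, facets: ["department", "days", "timeWindow"] },
  { label: "Section details", tier: "section",  collapsed: false, facets: ["instructor", "location", "availability"] },
  { label: "Advanced",        tier: "advanced", collapsed: true,  facets: ["credits", "deliveryMode", "attributes"] },
];
```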

3) Smarter Result Cards

[Problem] Single‑section courses wasted space; multi‑section courses required drilling into details to compare essentials.

[Decision] Merge single‑section courses into compact cards that still expose key info. For multi‑section courses, surface time/instructor/availability inline with quick compare affordances.

[Why] Enables at‑a‑glance elimination and comparison without opening details; reduces pogo‑sticking between list and detail.

[Impact] Detail‑page opens per task ↓ 18%; scan‑to‑decision time ↓ 25%.
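
A sketch of the merge logic, assuming the catalog feed delivers one flat row per section (the field names are illustrative, not the real schema):

```ts
// One row per offered section, as it might arrive from the catalog feed.
interface Section {
  courseCode: string;   // e.g. "PUDM 2100"
  title: string;
  crn: string;
  instructor: string;
  meets: string;        // e.g. "Tue/Thu 14:00–15:40"
  seatsOpen: number;
}

interface CourseCard {
  courseCode: string;
  title: string;
  sections: Section[];  // 1 section => compact card; >1 => inline compare rows
}

// Collapse the flat section list into one card per course, so
// single-section courses render compactly and multi-section courses
// expose time/instructor/seats inline for comparison.
function toCards(sections: Section[]): CourseCard[] {
  const byCourse = new Map<string, CourseCard>();
  for (const s of sections) {
    const card = byCourse.get(s.courseCode) ??
      { courseCode: s.courseCode, title: s.title, sections: [] };
    card.sections.push(s);
    byCourse.set(s.courseCode, card);
  }
  return [...byCourse.values()];
}
```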

4) Wishlist with Cookie‑Based Persistence & Conflict Comparison

[Problem] Students needed a place to curate options and return to later, but FERPA limitations ruled out account‑based persistence and access to real registration data.

[Decision] A wishlist that persists via first‑party cookies (opt‑in) and powers a schedule/conflict view; conflicts are highlighted with suggested alternates, and the final step supports one‑click CRN copy.

[Why] Preserves continuity without sign‑in; keeps privacy intact while supporting realistic planning and the last‑mile handoff.

[Impact] Return‑session re‑onboarding time ↓ 15%; conflict‑related task failures ↓ 20%; hand‑off time to registration ↓ 45%.
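
A minimal sketch of the mechanics, using only standard browser APIs; the cookie name, data shape, and helper names are ours for illustration:

```ts
// Meeting slot in minutes from midnight; day is 0–6 (Sun–Sat).
interface Slot { day: number; start: number; end: number }
interface WishlistItem { crn: string; slots: Slot[] }

function saveWishlist(items: WishlistItem[]): void {
  // First-party cookie only; no PII, so it works logged-out under FERPA.
  document.cookie =
    `wishlist=${encodeURIComponent(JSON.stringify(items))}; ` +
    `max-age=${60 * 60 * 24 * 90}; path=/; SameSite=Lax`;
}

function loadWishlist(): WishlistItem[] {
  const match = document.cookie.match(/(?:^|;\s*)wishlist=([^;]*)/);
  return match ? JSON.parse(decodeURIComponent(match[1])) : [];
}

// Two wishlist items conflict if any pair of their meeting slots overlaps.
function conflicts(a: WishlistItem, b: WishlistItem): boolean {
  return a.slots.some(x =>
    b.slots.some(y => x.day === y.day && x.start < y.end && y.start < x.end)
  );
}

// One-click hand-off: copy every wishlist CRN for the registration form.
async function copyCrns(items: WishlistItem[]): Promise<void> {
  await navigator.clipboard.writeText(items.map(i => i.crn).join(", "));
}
```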

Validation

[Method] Unmoderated remote test (n≈12–16; first‑year & transfer), A/B prototype comparison; click/scroll maps; post‑task SUS.

[Tasks]

  1. Find an elective that fits Tue/Thu afternoons and add it to the wishlist.

  2. Build a 3‑course wishlist with no time conflicts.

  3. Copy CRNs for registration hand‑off.

[Metrics] Task completion rate; mean/median time on task; clicks; back‑to‑top scrolls; page toggles; SUS.

[Early signals] Completion 88% vs 68%; mean time 235s vs 370s (−37%); page toggles 4.8 vs 9.2 (−48%).

Reflection & Next Steps

  • Consent‑based integration with registration for real‑time seat/waitlist data while maintaining FERPA compliance.

  • Mobile micro‑interactions to accelerate compare/add (long‑press add, swipe to compare).

  • Accessibility hardening: keyboard flow, focus order, section comparison semantics for screen readers.

  • Analytics plan: event taxonomy for filter usage, wishlist retention, conflict resolutions; monitor KPIs over time and trigger design tweaks accordingly.
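
A possible starting point for that taxonomy, sketched as a TypeScript union; every event name here is an assumption to be refined with the analytics team:

```ts
// Covers the three behaviors we want to monitor: filter usage,
// wishlist retention, and conflict resolution.
type CatalogEvent =
  | { type: "filter_applied"; facet: string; value: string }
  | { type: "wishlist_add"; crn: string }
  | { type: "wishlist_return_session"; itemCount: number } // retention signal
  | { type: "conflict_shown"; crns: [string, string] }
  | { type: "conflict_resolved"; keptCrn: string; droppedCrn: string }
  | { type: "crn_copied"; count: number };

function track(event: CatalogEvent): void {
  // Hypothetical sink; in practice this would feed the analytics pipeline.
  navigator.sendBeacon?.("/analytics", JSON.stringify(event));
}
```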