UX/UI Product Management March 2026

Forge: Redesigning the Enterprise Design Review Experience

A product management case study — from discovery research through redesign, roadmap, and projected impact — for a fictional enterprise design review platform with a critical adoption problem.

§ 01

The Product

Forge is an enterprise SaaS platform used by manufacturing and consumer product companies to manage design reviews, stakeholder approvals, and revision tracking. It sits at the intersection of design tooling and Product Lifecycle Management (PLM) — where creative decisions meet engineering constraints and business sign-off.

The platform has strong enterprise contracts and deep integration with CAD and ERP systems. But it has a serious problem: adoption is collapsing. Seat utilization dropped from 74% to 41% over 18 months. Designers avoid it, stakeholders can't navigate it, and the approval workflow that justifies the product's existence is buried under layers of complexity.

My Role

VP Product, brought in to diagnose the adoption failure and lead the redesign from research through launch. Responsible for strategy, prioritization, cross-functional alignment, and outcome measurement.

The product worked. People just stopped using it. That's a design problem, not a feature problem.
§ 02

Research & Discovery

Before proposing solutions, the team needed to understand why adoption was failing. We conducted 24 user interviews across 6 client organizations, analyzed 90 days of session telemetry, and ran a competitive audit against 5 alternative workflows teams had adopted informally.

Key Findings

  • Navigation collapse. 67% of users couldn't locate the approval workflow within 3 clicks. The information architecture had grown organically over 4 years without restructuring — what started as a clean hierarchy had become a maze of nested menus and redundant paths.
  • Context-switching penalty. Designers had to leave their primary tools (SolidWorks, Figma, Adobe CC) to upload, annotate, and submit work for review. Every review cycle required 12+ minutes of file management before any actual review happened.
  • Notification blindness. Reviewers received an average of 34 notifications per day from Forge, with no priority differentiation. 78% of review requests were ignored for 48+ hours, creating project bottlenecks attributed to the tool rather than the people.
  • Role confusion. The same interface served designers uploading work, engineers reviewing specifications, and VPs approving milestones. None of these users shared the same mental model, yet all saw the same dashboard.
  • Shadow workflows. 5 of 6 client organizations had developed informal workarounds using Slack, email attachments, and shared drives — routing around Forge entirely for day-to-day reviews and only using it for compliance documentation.
§ 03

User Personas

Three distinct user types emerged from research, each with fundamentally different needs, workflows, and definitions of success.

Primary Creator · The Designer
Uploads work, manages revisions, responds to feedback. Needs frictionless submission and clear status visibility.
"I spend more time managing files in Forge than actually designing. It feels like the tool is working against me."

Technical Reviewer · The Engineer
Reviews designs for feasibility, tolerances, and specifications. Needs side-by-side comparison and inline annotation.
"I can't see what changed between versions without opening three different screens. So I just ask the designer to walk me through it on Zoom."

Decision Maker · The VP Approver
Approves milestone gates and budget allocation. Needs a 30-second summary, not a deep dive into revisions.
"I don't need to see every version. I need to see the recommendation, the risk, and where to click 'approve.'"
§ 04

Heuristic Evaluation

A structured evaluation against Nielsen's 10 usability heuristics revealed systemic issues across the platform. The six highest-severity findings:

  • [C] No role-based views. All users see the same dashboard regardless of role. Designers, engineers, and executives have no dedicated entry point optimized for their workflow.
  • [C] Buried approval workflow. The core value proposition, design approval, requires 5+ clicks from the dashboard. No persistent status indicator shows pending approvals.
  • [C] No version comparison. Reviewing changes between design iterations requires opening each version separately. No diff view, overlay, or change summary exists.
  • [M] Notification overload. No priority hierarchy in notifications. Routine updates and critical approval requests arrive with identical presentation.
  • [M] Upload friction. File submission requires 7 form fields before upload begins. Most fields have sensible defaults that could be auto-populated from project context.
  • [M] No mobile experience. VP approvers frequently need to act on approvals from mobile devices. The current interface is desktop-only with no responsive adaptation.

[C] = Critical (blocks core workflow), [M] = Major (significant friction, workaround required).
§ 05

Proposed Redesign

The redesign is organized around a single principle: every user should reach their primary action within 2 clicks of login. This required restructuring the information architecture around roles rather than features.

Role-Based Dashboards

Three distinct entry experiences replace the universal dashboard. Each surfaces only the information and actions relevant to the user's role, with progressive disclosure for deeper functionality. Designers see their active projects and submission status. Engineers see pending reviews with change summaries. VPs see a decision queue with recommendations.
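The role-to-entry-point mapping behind these dashboards can be expressed as a small routing table. A minimal sketch in Python; the role keys and widget names are illustrative assumptions, not Forge's actual schema:

```python
# Hypothetical role -> landing-view configuration; names are illustrative.
DASHBOARDS = {
    "designer": ["active_projects", "submission_status"],
    "engineer": ["pending_reviews", "change_summaries"],
    "vp":       ["decision_queue", "recommendations"],
}

def landing_view(role: str) -> list[str]:
    """Route each role to its own entry experience.

    Unknown roles fall back to the designer view rather than a universal
    dashboard, preserving the 2-clicks-to-primary-action principle.
    """
    return DASHBOARDS.get(role, DASHBOARDS["designer"])
```

The point of the sketch is that role separation is configuration, not three separate products: deeper functionality stays shared and is reached through progressive disclosure.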

Approval Flow Redesign

The approval workflow moves from a buried settings menu to a persistent status bar visible on every screen. Pending approvals surface with context — what changed, who's waiting, what's the deadline. One-click approval from the notification itself, including a mobile-optimized approval card for VP users.

Version Comparison Engine

A side-by-side diff view for design iterations with overlay, slider, and annotation tools. Engineers can review changes visually without opening separate files. Change summaries auto-generate from revision metadata, highlighting what's new, modified, or removed.
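The auto-generated change summary is essentially a diff over revision metadata. A minimal sketch in Python, assuming metadata arrives as flat key-value pairs (the field names below are hypothetical):

```python
def summarize_changes(prev: dict, curr: dict) -> dict:
    """Diff two revision-metadata dicts into new / modified / removed fields."""
    added = {k: curr[k] for k in curr.keys() - prev.keys()}
    removed = {k: prev[k] for k in prev.keys() - curr.keys()}
    modified = {
        k: (prev[k], curr[k])
        for k in prev.keys() & curr.keys()
        if prev[k] != curr[k]
    }
    return {"added": added, "modified": modified, "removed": removed}

# Hypothetical revision metadata for two design iterations.
prev = {"material": "ABS", "wall_mm": 2.0, "finish": "matte"}
curr = {"material": "ABS", "wall_mm": 1.6, "fastener": "M3"}
summary = summarize_changes(prev, curr)
```

Here the summary would report `wall_mm` as modified, `fastener` as new, and `finish` as removed; the visual overlay and slider tools handle what metadata alone cannot capture.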

Smart Notifications

Notification triage engine with three priority tiers: action required (you're blocking someone), FYI (relevant to your work), and archive (system updates). Digest mode available for non-urgent items. Critical notifications persist until acknowledged.
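The three tiers reduce to two questions about each event: is someone blocked on you, and does it touch your work? A rule-based sketch in Python, assuming hypothetical notification attributes (a production scorer could weigh deadlines, roles, and history):

```python
from dataclasses import dataclass

@dataclass
class Notification:
    event: str            # e.g. "approval_requested", "comment", "system_update"
    blocks_someone: bool  # is another user waiting on the recipient?
    involves_user: bool   # does it touch the recipient's projects?

def triage(n: Notification) -> str:
    """Assign one of three priority tiers."""
    if n.blocks_someone:
        return "action_required"  # persists until acknowledged
    if n.involves_user:
        return "fyi"              # eligible for digest mode
    return "archive"              # stored, never pushed
```

For example, an approval request that blocks a reviewer would triage to `action_required`, while a routine system update lands in `archive` and never interrupts anyone.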

Streamlined Upload

Upload reduced from 7 fields to 2 — file and project. All other metadata auto-populates from project context, file type detection, and user history. Drag-and-drop from file system or direct integration with design tool plugins.
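The auto-population amounts to deriving the removed fields from context the system already holds. A minimal sketch in Python; the field names, extension mappings, and project shape are assumptions for illustration:

```python
import pathlib

# Hypothetical extension -> file-type mapping inferred at upload time.
FILE_TYPES = {".sldprt": "cad", ".fig": "ui", ".psd": "graphic"}

def build_submission(file_path: str, project: dict, user_history: dict) -> dict:
    """Expand the two user-supplied fields (file, project) into a full record."""
    ext = pathlib.PurePath(file_path).suffix.lower()
    return {
        "file": file_path,
        "project": project["id"],
        # Everything below is auto-populated, not typed by the designer.
        "milestone": project.get("active_milestone"),
        "reviewers": project.get("default_reviewers", []),
        "file_type": FILE_TYPES.get(ext, "other"),
        "tags": user_history.get("recent_tags", []),
    }
```

Each derived field stays editable after upload; the defaults remove the decision, not the control.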

The goal isn't more features. It's fewer decisions required to accomplish the task.
§ 06

Prioritized Roadmap

Scope discipline is the hardest part of any redesign. The roadmap is structured around impact on the core adoption metric — seat utilization — with each phase building on validated outcomes from the previous one.

V1 (Weeks 1–8)
  • Role-based dashboards — three distinct entry experiences replacing the universal view.
  • Approval status bar — persistent, visible on every screen, one-click approve.
  • Upload simplification — reduce to 2 fields with auto-population.
  • Notification priority tiers — action required vs. FYI vs. archive.
V2 (Weeks 9–16)
  • Version comparison engine — side-by-side diff with overlay and annotations.
  • Mobile approval cards — responsive approval flow for VP users.
  • Design tool plugins — direct submission from SolidWorks, Figma, Adobe CC.
V3 (Weeks 17–24)
  • Smart notification digests — ML-driven priority scoring based on user behavior.
  • Automated change summaries — AI-generated revision notes from file diffs.
  • Analytics dashboard — review cycle time, bottleneck identification, team velocity.
§ 07

Projected Impact

Targets are based on benchmark data from comparable enterprise UX redesigns and validated against the specific friction points identified in discovery research.

  • Seat Utilization: 41% → 68%. Primary adoption metric. Role-based views eliminate the "I don't know where to start" dropout.
  • Approval Access: 5+ clicks → 1 click. Persistent status bar and one-click approval eliminate the buried-workflow problem.
  • Upload Time: 12 min → 2 min. Field reduction and auto-population remove the file-management tax on every review cycle.
  • Review Response: 48+ hrs → <8 hrs. Priority notifications and mobile approvals compress the review bottleneck.
  • Shadow Workflows: 5/6 → 1/6. Client organizations using informal workarounds to route around Forge. Target: make the tool faster than the workaround.
  • Internal Satisfaction: 2.1 → 4.2 (5-point scale). User satisfaction measured quarterly across all three persona types.
§ 08

The Product Worked. The Experience Didn't.

Forge's adoption failure wasn't a technology problem — it was a design problem. The platform had the right capabilities but the wrong experience. Users didn't need more features; they needed fewer barriers between intent and action.

The redesign applies the same principle that governs great industrial design: every element earns its place, every interaction serves a purpose, and the system respects how people actually work rather than how an architect imagined they would.

The most expensive feature in any enterprise product is the one that makes users avoid the product entirely.

Adoption is a design metric. If people aren't using it, the design is wrong — regardless of what the feature list says.