Selected Work
SysAid

Brought SysAid's first self-serve Advanced trial to a traditionally sales-led product

License Manager Advanced is SysAid's SaaS spend intelligence module. Getting IT teams to evaluate it without a sales conversation meant designing an entirely new kind of first impression, one that showed the problem clearly enough to create urgency, without giving the solution away for free.

B2B SaaS · Product-Led Sales · License Manager · Cross-functional leadership
SysAid License Manager product interface
Context

Why this project existed

SysAid is an enterprise ITSM platform that, for its entire history, has grown through direct sales. Every new customer started with a sales conversation, a demo, and a procurement cycle. The product had never had to make a first impression on its own.

License Manager Advanced is a module within SysAid that discovers SaaS applications running across an organization and surfaces cost savings. The business wanted to introduce a PLS motion as a first step toward full PLG: a free 14-day trial that lets prospects evaluate the product autonomously, get to value on their own terms, and then make a qualified, informed call to sales. The immediate goal was self-serve evaluation, not self-serve conversion — but the longer-term direction is a fully product-led growth model.

The design problem was unusual: you can't interview trial users before they've used the product. There's no discovery phase with them. You design, you launch, and you watch what happens in the recordings.

1st
First self-serve trial in SysAid's history, with no prior baseline to design from
PLS
Product-Led Sales: trial generates qualified leads for sales, not self-serve conversions
Live
Post-launch measurement via session recordings, as traditional user research wasn't possible
My role

My design challenge

I led design across three parallel workstreams: the trial start experience, the in-trial limitation UI, and the trial end and expiry experience. The core tension in all three was the same: the product's value is discovery, finding SaaS applications your organization didn't know it was running. The trial had to demonstrate that value clearly without giving it away fully. Show enough to create urgency; not so much that there's no reason to upgrade.

The harder part of the job wasn't the UI. Sales needed to believe the trial wouldn't bypass them: it would generate better-qualified leads, not self-serve conversions that skipped the conversation entirely. That required showing the sales team the design logic: the trial reveals the problem, but sales closes the solution. Aligning marketing on who the trial was targeting, and what message prospects would arrive with, was a major workstream alongside the product design itself.

Design workstreams

Four flows, designed in parallel

The trial experience was built across four parallel flows rather than sequential iterations. Pre-trial, onboarding, in-trial, and post-trial each posed a distinct design challenge, but all four were anchored to the same central insight: make the limitation visible, make the value undeniable, and make the path to sales feel like the natural next step.

01 Pre-Trial
02 Onboarding
03 In-Trial
04 Post-Trial
Product
  • Trial start paywall
  • Product intro
  • Scan setup
  • In-product guidance
  • App list (limited)
  • App detail view
  • Trial expiry paywall
Marketing & Sales
  • Audience targeting
  • Welcome email
  • Mid-trial check-in
  • Expiry reminder
  • Sales outreach
  • Lead qualification
Workstream 01

Trial start experience

Two screens make up the trial start experience. The first is an in-product marketing page that prospects see before committing to the trial. It shows the full value proposition (cost savings, compliance visibility, shadow IT detection) alongside a scrollable preview of the actual product, so the decision to start the trial is informed, not blind.

The second screen is a three-step onboarding wizard that runs immediately after clicking "Start trial now." Step 1 asks how the user wants to discover their applications: connect Microsoft Entra ID (the recommended path, which automatically surfaces the top 20 apps in minutes), deploy a browser extension, or add apps manually. The goal was to get users to their first real data as fast as possible, without tutorials or forced configuration.
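The wizard's shape can be sketched as a small data structure. This is a hypothetical illustration only: the step titles come from the screens described above, but the type names, fields, and the `DiscoveryMethod` union are my assumptions, not SysAid's actual code.

```typescript
// Illustrative sketch of the three-step onboarding wizard.
// All identifiers here are assumptions for illustration.
type DiscoveryMethod = "entra-id" | "browser-extension" | "manual";

interface WizardStep {
  id: number;
  title: string;
  skippable: boolean; // skipping is always an option in the real flow
}

const onboardingSteps: WizardStep[] = [
  { id: 1, title: "Discover your applications", skippable: true },
  { id: 2, title: "Connect Microsoft Entra ID", skippable: true },
  { id: 3, title: "Enable Advanced Insights", skippable: true },
];

// The recommended path: Entra ID surfaces the top 20 apps automatically.
function recommendedMethod(): DiscoveryMethod {
  return "entra-id";
}
```

Modeling every step as skippable reflects the design stance above: the wizard accelerates users toward first data, it never gates them.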

License Manager Advanced trial entry page with product value preview

The trial entry page. Before starting the 14-day trial, prospects see the full value proposition alongside a live preview of the product. The decision to start is informed before they've clicked anything.

Onboarding step 1: Discover your applications Onboarding step 2: Connect Microsoft Entra ID Onboarding step 3: Enable Advanced Insights

Step 1: Discover your applications. Choose how to connect: Microsoft Entra ID (recommended, finds top 20 apps automatically), a browser extension, or manual entry. Skipping is always an option.

Step 2: Connect Microsoft Entra ID. A focused 2-step guide: read the setup documentation, then connect the add-on in configuration settings. The right panel shows a preview of what the data will look like.

Step 3: Enable Advanced Insights. Read-only Entra ID access lets SysAid calculate savings from usage and seat counts. Privacy-first messaging ("we never modify your settings") is built directly into the step.

Workstream 02

In-trial limitation UI

This was the core design decision of the entire project. Once inside the trial, users see the full applications table with rich data: total estimated savings, newly discovered apps, utilization rates. A persistent trial countdown banner at the top reminds them of the 14-day window.

The real conversion moment, though, is at the bottom of the table. After scrolling through their 20 visible apps, users hit a banner: "241 applications discovered. You're currently viewing the top 20. Upgrade to explore your full application ecosystem and uncover hidden savings." It's not a popup or an interrupt. It appears naturally as a scroll reveal at the end of the list, at the exact moment the user has already seen the value and is looking for more. The placement was deliberate: earn the user's attention first, then show them what they're missing.
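The limitation logic itself is simple, which is part of why it works. A minimal sketch, assuming a 20-app visibility cap and a banner generated from the user's own scan count; `TRIAL_VISIBLE_LIMIT`, `buildTrialView`, and the record shape are illustrative names, not the shipped implementation:

```typescript
// Sketch of the in-trial limitation: show the top N apps, then reveal
// the full scan count as an upgrade prompt. Names are assumptions.
interface AppRecord {
  name: string;
  estimatedSavings: number;
}

const TRIAL_VISIBLE_LIMIT = 20;

interface TrialView {
  visible: AppRecord[];
  hiddenCount: number;
  banner: string | null; // rendered as a scroll reveal at the end of the list
}

function buildTrialView(discovered: AppRecord[]): TrialView {
  const visible = discovered.slice(0, TRIAL_VISIBLE_LIMIT);
  const hiddenCount = Math.max(0, discovered.length - TRIAL_VISIBLE_LIMIT);
  const banner =
    hiddenCount > 0
      ? `${discovered.length} applications discovered. You're currently viewing ` +
        `the top ${TRIAL_VISIBLE_LIMIT}. Upgrade to explore your full application ` +
        `ecosystem and uncover hidden savings.`
      : null;
  return { visible, hiddenCount, banner };
}
```

Note that the banner is derived from the user's real scan, so a prospect with 241 discovered apps sees "241 applications discovered," not a generic upsell.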

License Manager in-trial state: applications table with 14-day countdown banner

The in-trial state. A full applications table with estimated savings, utilization rates, and discovery data visible from day one. The 14-day countdown banner at the top maintains gentle urgency without blocking the experience.

241 applications discovered scroll reveal at the bottom of the apps table

The scroll reveal. At the bottom of the table, after the user has explored their 20 visible apps, the full scan count appears: 241 applications discovered. The placement is intentional. Earn the user's attention first, then show them what they're missing.

Workstream 03

Trial end and expiry experience

When the 14-day trial ends, the user sees a single page: "Your free trial is complete. Upgrade to License Manager Advanced for full access to: Manage your 241 discovered applications. Track usage and reclaim licenses. Ensure license compliance. Detect Shadow IT." Then: "Ready to continue?" with a "Request pricing" button. Below that, a quiet link to share trial feedback. The design challenge was making this feel like a natural handoff, not a wall. The specific number (241 applications) is pulled from the user's own scan, so it's personal. The list of locked features is exactly what they explored during the trial. The ask is clear but not coercive, because the user already knows what they're missing.
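The personalization described above can be sketched as a tiny copy-building function. This is a hedged illustration, not the shipped code: the `ScanSummary` shape and `expiryHeadline` name are my assumptions, standing in for however the real page pulls the user's scan results.

```typescript
// Sketch: the expiry page's upgrade copy is built from the user's own
// scan results rather than generic marketing text. Names are assumptions.
interface ScanSummary {
  discoveredApps: number; // e.g. 241 for the trial shown above
}

function expiryHeadline(scan: ScanSummary): string {
  return `Manage your ${scan.discoveredApps} discovered applications.`;
}
```

The design point the sketch encodes: the number on the expiry page is the user's number, which is why the ask reads as a handoff rather than a wall.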

Trial complete page: Your free trial is complete, upgrade to License Manager Advanced

The trial end page. The user's own discovery results (241 applications) are surfaced in the upgrade ask. The list of locked features maps directly to what they explored. "Request pricing" is the only hard CTA; a feedback link gives users a low-friction exit that still generates signal.

In-product guidance

Using UserFlow to drive adoption

Getting users to connect their environment is step one. Getting them to enrich the data (adding license counts, cost information, and reviewing discovered apps) is what actually makes the product valuable during the trial window. We used UserFlow to bridge that gap without requiring a support rep or onboarding call.

Once the application scan completes and the user lands on the Applications table, UserFlow triggers a contextual tooltip: "Your applications are ready. Discover how to manage your applications and optimize costs with a brief 1-minute walkthrough." The user can accept or dismiss. If they accept, a guided side panel walks them through the remaining enrichment steps in sequence, with contextual tooltips pointing to the exact fields they need to fill in.

The design logic: the more data a user puts into License Manager during the trial, the more accurate their savings estimate, and the stronger the upgrade case. UserFlow was the mechanism for nudging users toward that enrichment at the right moment, with the right framing, without interrupting the core experience.
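The trigger condition, show the walkthrough once, only after the scan has produced real data and the user is looking at it, can be sketched as a small state check. The `maybeOfferWalkthrough` function and `TrialState` shape are illustrative assumptions; in production the `offer` callback would stand in for the actual UserFlow invocation, whose API is not shown here.

```typescript
// Sketch of the walkthrough trigger: fire once, at the right moment,
// never as an interrupt. All names are illustrative assumptions.
interface TrialState {
  scanComplete: boolean;
  onApplicationsTable: boolean;
  walkthroughOffered: boolean;
}

function maybeOfferWalkthrough(
  state: TrialState,
  offer: () => void // stand-in for triggering the UserFlow tooltip
): TrialState {
  // Only offer when there is real data on screen, and only offer once.
  if (state.scanComplete && state.onApplicationsTable && !state.walkthroughOffered) {
    offer();
    return { ...state, walkthroughOffered: true };
  }
  return state;
}
```

The single-shot flag matters: the tooltip earns its moment once, and a dismissal is respected rather than re-prompted.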

UserFlow tooltip: Your applications are ready, start 1-minute walkthrough

The walkthrough trigger. After the scan completes, UserFlow surfaces a tooltip showing utilization data and estimated savings already available. The ask is low-friction: a 1-minute walkthrough, with an easy dismiss if the user prefers to explore on their own.

UserFlow guided side panel: Set up applications with step-by-step checklist

The guided setup panel. A right-side panel walks through enrichment steps in order: connect Entra ID, add total licenses, add total cost, create custom applications. Contextual tooltips point to the exact column or field in the table, so users never lose their place in the product.

Research approach

Watching instead of asking

Traditional user research requires access to users before they experience the product: interviews, usability tests, observation sessions. For a trial, that's impossible. The trial IS the first experience. There's no one to interview before it exists.

The primary research tool was session recordings: watching real trial users navigate the product after launch. Where they hesitated. Where they dropped off. What they actually clicked versus what the design expected them to click. Every recording session fed directly into the next design decision.

This constraint shaped the design philosophy. Build something that works before you fully understand the user, then iterate fast on what you see. The post-launch feedback cycle was part of the design process from the start, not an afterthought added once the product was live.

Design → Launch → Watch recordings → Identify friction → Iterate
Outcome

What launched, and what we're measuring

The trial launched in early 2025, marking SysAid's first self-serve product experience in the company's history. Results are still early, but the measurement infrastructure is in place. The metrics below are what we're actively tracking; numbers will be updated as the data matures.

[XX%] Trial start conversion: of users who reached the trial entry point, % who completed setup and reached the applications list.

[XX%] Paywall engagement: of trial users who saw the "241 apps discovered" banner, % who clicked to learn about upgrading.

[XX] Sales-qualified leads: trial users who requested a sales conversation after seeing their discovery results.

Session recording insights: post-launch recordings identified specific friction points in the flow that are feeding directly into the next iteration cycle.

License Manager in-trial applications table with trial countdown banner

The shipped trial experience. The full applications table with estimated savings, utilization data, and discovery information visible from day one. The trial countdown banner maintains urgency without blocking the product.
