Build a demo app called "Replaydar.ai" — an AI-powered session replay analysis platform that automatically watches every session recording, detects bugs and UX issues, and surfaces actionable fixes with severity scores and plain-English descriptions. Instead of teams manually scrubbing through thousands of unwatched recordings in tools like Hotjar, FullStory, or LogRocket, Replaydar's AI watches every single session, identifies rage clicks, broken flows, error states, drop-off patterns, and confusing UX — then presents a prioritised issue list with specific, actionable fixes. Over time it learns what "normal" looks like for your app and alerts on regressions.

WHO: Product managers, UX designers, and engineering leads at B2B SaaS companies who have session replay tools but never have time to watch the recordings.
WHAT: An AI that watches your session replays so you don't have to — and tells you exactly what's broken and how to fix it.
WHY: Teams record thousands of sessions but watch less than 1%. Bugs and UX friction go undetected for months. Replaydar turns session replays from a passive archive into an active bug-detection and UX improvement engine.

This is a DEMO app. Use hardcoded dummy data throughout. No real authentication — clicking "Enter Demo" or "Try the Dashboard" on the landing page navigates directly into the app dashboard. No login screen, no signup forms that actually validate. The simulated product being analysed inside the dashboard is a B2B SaaS project management tool called "Planwise" — a fictional mid-market PM app with boards, timelines, tasks, and team collaboration features.

---

VISUAL STYLE:

Apply a clean light-mode SaaS dashboard aesthetic throughout — inspired by Linear's light mode, Notion, and Vercel's dashboard.

- Background: off-white (#FAFAFA). Card surfaces: white (#FFFFFF) with subtle 1px borders (#E5E7EB).
- Accents: primary indigo (#6366F1); secondary emerald (#10B981) for success states; tertiary amber (#F59E0B) for warning/medium-severity states; rose (#EF4444) for error/critical; sky blue (#0EA5E9) for info.
- Text: near-black (#111827) for headings, medium gray (#6B7280) for secondary text. Use the Inter font family throughout.
- Cards: clean white surfaces with subtle box-shadow (0 1px 3px rgba(0,0,0,0.08), 0 1px 2px rgba(0,0,0,0.04)) instead of glassmorphism. Rounded corners (12px) on all cards and containers.
- Spacing: generous — 24px padding inside cards, 16px gap between grid items.
- Severity badges use semantic colours: rose (#EF4444) for Critical, amber (#F59E0B) for High, sky blue (#0EA5E9) for Medium, slate (#94A3B8) for Low.
- Gradient: a subtle indigo-to-violet gradient (#6366F1 to #8B5CF6) on hero sections, key CTAs, and primary action buttons.
- Hover: cards gain a slightly stronger shadow and a faint indigo border; buttons get a slight scale(1.02).
- Integration logos (Hotjar, FullStory, LogRocket, Clarity, PostHog): small square icons with subtle borders — connected ones have an emerald dot indicator, disconnected ones are slightly faded.

LOGO: The Replaydar.ai logo should be the text "replaydar" in lowercase, semi-bold Inter font, near-black (#111827) — with ".ai" in indigo (#6366F1). A small radar/pulse icon (two concentric quarter-arcs, like a WiFi or radar symbol rotated to point right) in indigo should sit before the text. Use this in the nav, sidebar, and footer.

---

PAGES AND ROUTES:

| Route              | Page                        |
|--------------------|-----------------------------|
| /                  | Landing Page (marketing)    |
| /app/dashboard     | Dashboard                   |
| /app/issues        | Issues                      |
| /app/sessions      | Sessions                    |
| /app/insights      | Insights & Trends           |

The landing page (/) has its own layout — no sidebar, just a sticky nav bar.

All /app/* pages share an app layout with a fixed left sidebar (hidden on mobile, becomes a slide-in overlay).
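
The route table and the two layouts above could be modelled as follows — a minimal sketch, not the app's actual code. The `RouteDef` type and page names are illustrative; in a React Router setup the `/app/*` routes would nest as children of a shared `AppLayout` element containing the sidebar.

```typescript
// Hypothetical route map mirroring the table above.
type LayoutKind = "marketing" | "app";

interface RouteDef {
  path: string;
  layout: LayoutKind; // which shell wraps the page
  page: string;       // placeholder for the page component name
}

const ROUTES: RouteDef[] = [
  { path: "/", layout: "marketing", page: "Landing" },
  { path: "/app/dashboard", layout: "app", page: "Dashboard" },
  { path: "/app/issues", layout: "app", page: "Issues" },
  { path: "/app/sessions", layout: "app", page: "Sessions" },
  { path: "/app/insights", layout: "app", page: "Insights" },
];

// Every /app/* route shares the sidebar layout; the landing page does not.
function layoutFor(path: string): LayoutKind {
  return path.startsWith("/app/") ? "app" : "marketing";
}
```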

---

PAGE 1: LANDING PAGE (route: "/")

Fixed Navigation Bar:
- White background with subtle bottom border (#E5E7EB), backdrop blur on scroll
- Logo on the left
- Nav links center: "Features", "How It Works", "Integrations", "Testimonials" (smooth scroll to sections)
- "Enter Demo" button on the right (links to /app/dashboard) — indigo-to-violet gradient background, white text, rounded-lg
- Mobile: hamburger menu that opens a dropdown with the same links

Hero Section:
- Full viewport height, centered content
- Background: white with a subtle radial gradient glow of indigo at ~6% opacity behind the headline area, plus a very faint grid pattern overlay (like graph paper at 3% opacity)
- Eyebrow text above headline: "AI-POWERED SESSION REPLAY ANALYSIS" in small caps, indigo color, letter-spacing 3px, font-medium
- Large headline: "Your session replays are full of answers. Nobody's watching them." — text-4xl on mobile, text-6xl on desktop, bold, tight tracking, near-black
- Subheadline in medium gray (max-w-2xl): "Replaydar's AI watches every session recording from Hotjar, FullStory, LogRocket, Clarity, or PostHog — detects bugs, UX friction, and drop-off patterns — and surfaces prioritised, actionable fixes in plain English."
- Two CTA buttons side by side:
  - "Enter Demo" (primary, indigo-to-violet gradient background, white text) — navigates to /app/dashboard
  - "See How It Works" (secondary, white background with indigo text and indigo/10 border) — scrolls down to How It Works section
- Below buttons, small muted text: "Watching 2.4M+ sessions across 340+ teams  ·  Avg 47 issues found per 1,000 sessions  ·  83% fix rate within 7 days"
- Floating stats card below hero text: white card with subtle shadow, 4 stats in a row: "< 1%" (rose, "Sessions Watched Manually"), "47" (indigo, "Avg Issues per 1K Sessions"), "83%" (emerald, "Fix Rate in 7 Days"), "4.2hrs" (amber, "Avg Time to First Fix")
- Integration logos row below the stats card: small grayscale logos for Hotjar, FullStory, LogRocket, Clarity, PostHog with label "Works with your existing replay tools" in muted text
- Use framer-motion staggered fade-in animations (initial opacity:0, y:20)

How It Works Section (id="how-it-works"):
- Section heading: "From unwatched recordings to fixed bugs in three steps" + subtitle "No workflow changes. No tagging. Just connect and go."
- 3 white cards in a row with subtle shadows, each with:
  - Step number in a small indigo circle ("1", "2", "3")
  - Coloured icon in a rounded square (icon background at 8% opacity of the icon colour)
  - Step 1: Plug icon (indigo) — "Connect Your Replay Tool" — "Link your Hotjar, FullStory, LogRocket, Clarity, or PostHog account in one click. Replaydar starts ingesting your session recordings immediately — no SDK, no code changes."
  - Step 2: Eye icon (violet) — "AI Watches Every Session" — "Our AI reviews every recorded session in real-time. It detects rage clicks, dead clicks, broken flows, JavaScript errors, confusing navigation patterns, form abandonment, and drop-off points — across every page and every user segment."
  - Step 3: Wrench icon (emerald) — "Get Prioritised Fixes" — "Issues surface in a prioritised list with severity scores, affected user counts, plain-English descriptions, and specific, actionable fix suggestions. Your team fixes what matters most, fast."
- Animate in on scroll with stagger

Features Section (id="features"):
- Heading: "Everything your replay tool should have told you" + subtitle "Replaydar turns passive recordings into an active QA and UX improvement engine."
- 2x2 grid of feature cards (white cards with subtle shadow):
  - Card 1: Bug icon (rose) — "Automated Bug Detection" — "Catches JavaScript errors, broken UI states, failed API calls, and rendering glitches that users encounter but never report. Every bug includes the exact session, timestamp, and browser context."
  - Card 2: MousePointerClick icon (amber) — "Rage Click & Dead Click Detection" — "Identifies where users click repeatedly in frustration or click on elements that aren't interactive. Surfaces the exact elements, pages, and frequency with suggested fixes."
  - Card 3: Route icon (indigo) — "Broken Flow Analysis" — "Maps expected user flows (onboarding, checkout, feature adoption) and detects where users deviate, loop, or abandon. Shows exactly where the flow breaks and why."
  - Card 4: TrendingUp icon (emerald) — "Regression Alerts" — "Learns what 'normal' looks like for your app. When a new deploy causes rage clicks to spike or a flow's completion rate drops, Replaydar alerts you before your users complain."

Integrations Section (id="integrations"):
- Heading: "Connects to the tools you already use" + subtitle "One-click setup. No SDK changes."
- 5 integration cards in a row (white with subtle border), each showing:
  - Integration logo placeholder (a coloured square icon)
  - Name: Hotjar, FullStory, LogRocket, Clarity, PostHog
  - Small description of what data is pulled
  - Hotjar: "Session recordings, heatmaps, and user feedback"
  - FullStory: "Full session data with Data Layer Capture events"
  - LogRocket: "Sessions, console logs, and network requests"
  - Clarity: "Session recordings and behavioural analytics"
  - PostHog: "Session replays, feature flags, and event data"
- On mobile, horizontally scrollable row

Testimonials Section (id="testimonials"):
- Heading: "Teams ship better products with Replaydar" + subtitle
- 3 white testimonial cards with subtle shadow:
  - Quote icon at top (indigo at 20% opacity)
  - Testimonial 1: "We had 14,000 sessions recorded last month and watched maybe 30. In its first week, Replaydar found a broken drag-and-drop interaction that had been frustrating users for months. We fixed it in a day." — Megan Torres, Head of Product, Stackline
  - Testimonial 2: "The AI descriptions are shockingly good. It doesn't just say 'rage click detected' — it says 'Users are rage-clicking the Save button on the project settings page because the button enters a disabled state after the first click and never re-enables.' That's actionable." — David Okonkwo, Engineering Lead, Driftboard
  - Testimonial 3: "We used to do quarterly UX audits by watching sessions for a week straight. Now Replaydar does it continuously. Our UX issue backlog went from vague hunches to a prioritised, evidence-backed list." — Anya Petrov, UX Lead, Runbook

CTA Section:
- White card with indigo/5 background tint, subtle indigo border, large padding
- "See Replaydar analyse a real product"
- Subtext: "Explore the demo dashboard for Planwise, a fictional project management tool, and see how AI session replay analysis surfaces bugs and UX issues you'd never find manually."
- "Enter Demo Dashboard" button (indigo-to-violet gradient) — navigates to /app/dashboard

Footer:
- Border-top (#E5E7EB), white background
- Logo on left
- Links center: Features, How It Works, Integrations, Pricing, Privacy, Terms (all # hrefs)
- Tagline: "AI-powered session replay analysis. Find what's broken. Fix what matters."
- Copyright: "© 2026 Replaydar.ai. All rights reserved."

---

APP LAYOUT (shared by all /app/* pages):

Sidebar (Desktop):
- Fixed, left side, 240px wide, full height
- Background: white (#FFFFFF), right border (#E5E7EB)
- Logo at top with left padding
- Navigation items with Lucide icons, rounded-lg hover/active states:
  - Dashboard (LayoutDashboard icon) → /app/dashboard
  - Issues (AlertCircle icon) → /app/issues
  - Sessions (PlayCircle icon) → /app/sessions
  - Insights (LineChart icon) → /app/insights
- Active item: indigo left border (3px), bg-indigo/5 (#EEF2FF), text-indigo-700
- Inactive: text-gray-600 hover:bg-gray-50
- Divider line above bottom section
- "Connected Source" info at bottom: Small card showing integration status — green dot + "FullStory" as the connected source, "Planwise Production" as the project name, small "2 of 5 connected" text in muted gray
- Below that: 5 tiny integration icons in a row (Hotjar, FullStory, LogRocket, Clarity, PostHog) — FullStory and PostHog have green dot indicators (connected), the other three are faded (not connected)

Sidebar (Mobile):
- Hidden by default
- Mobile header bar: fixed top, white background, bottom border, hamburger button + logo
- Hamburger opens a slide-in overlay (light backdrop blur + sidebar from left)
- Close button in the overlay

Main Content Area:
- Left padding for sidebar on desktop (pl-60, matching the 240px sidebar width), top padding for mobile header on mobile (pt-14)
- Content max-width: max-w-7xl, centered, with padding
- Background: #FAFAFA

---

PAGE 2: DASHBOARD (route: "/app/dashboard")

Header:
- "Planwise — Session Analysis Overview" (h1, bold, near-black)
- "Showing data from the last 30 days. 8,429 sessions analysed." (muted text)
- Small row of connected source badges to the right: green-dotted "FullStory" pill and green-dotted "PostHog" pill

4 Key Metric Cards (responsive grid: 1→2→4 columns):
Each card is a white card with subtle shadow:
- Label (muted, small)
- Icon in a coloured rounded square (top right)
- Large number
- Small change indicator with up/down arrow

Cards:
1. "Issues Detected" — "142" — AlertCircle icon (rose bg at 10%) — "↑ 18 new this week"
2. "Sessions Analysed" — "8,429" — PlayCircle icon (indigo bg at 10%) — "100% coverage — 0 unwatched"
3. "Avg. Severity Score" — "6.4 / 10" — Gauge icon (amber bg at 10%) — "↓ 0.3 from last month (improving)"
4. "Fix Rate (7-day)" — "83%" — CheckCircle icon (emerald bg at 10%) — "↑ 12% vs last month"

Issue Severity Breakdown (full width white card):
- Title: "Issue Distribution by Severity"
- A horizontal stacked bar showing the breakdown:
  - Critical (rose): 12 issues (8%)
  - High (amber): 34 issues (24%)
  - Medium (sky blue): 58 issues (41%)
  - Low (slate): 38 issues (27%)
- Below the bar, 4 inline stat pills showing each severity with count and colour dot

Main Content Grid (3/5 + 2/5 on desktop):

Left: Recent Issues Feed (white card):
- Header: "Recently Detected Issues" with "View All →" link to /app/issues
- 7 issue items, each with severity dot, title, affected page, and timestamp:
  1. Rose dot (Critical) — "Save button disabled permanently after first click on Project Settings" — "/settings/project" — "Affects 23% of users who visit this page" — 2 hours ago
  2. Rose dot (Critical) — "Drag-and-drop on Kanban board fails silently in Firefox" — "/board/:id" — "Affects all Firefox users (11% of traffic)" — 5 hours ago
  3. Amber dot (High) — "Rage clicks on 'Invite Team Member' button — no loading indicator" — "/settings/team" — "312 rage click events this week" — 8 hours ago
  4. Amber dot (High) — "Onboarding wizard 'Next' button unresponsive on Step 3" — "/onboarding" — "41% of new users drop off at this step" — 12 hours ago
  5. Sky dot (Medium) — "Users repeatedly toggling between List and Board view on the Tasks page" — "/tasks" — "Suggests confusion about default view" — 1 day ago
  6. Sky dot (Medium) — "Form abandonment on 'Create Project' — 34% abandon after filling name field" — "/projects/new" — "Most common exit point: template selection" — 1 day ago
  7. Slate dot (Low) — "Tooltip on timeline zoom control clips off-screen on small viewports" — "/timeline" — "Cosmetic issue, 4 sessions observed" — 2 days ago

Right: Quick Actions (white card):
- Header: "Quick Actions"
- 4 action cards stacked:
  1. AlertCircle icon (rose) — "Critical Issues" — "12 critical issues need attention" — ArrowRight, links to /app/issues
  2. PlayCircle icon (indigo) — "Browse Sessions" — "8,429 analysed this month" — ArrowRight, links to /app/sessions
  3. LineChart icon (emerald) — "View Trends" — "Rage clicks down 18% this week" — ArrowRight, links to /app/insights
  4. Plug icon (sky) — "Manage Integrations" — "2 of 5 sources connected" — ArrowRight (non-functional)

AI Summary Card (full width white card, subtle indigo left border):
- Small Sparkles icon (indigo) + "AI Weekly Summary"
- Text block: "This week, Replaydar analysed 2,147 sessions across Planwise. The most impactful issue discovered is the permanently disabled Save button on the Project Settings page — it silently fails after the first click due to a missing state reset, affecting 23% of visitors to that page. The broken Firefox drag-and-drop on the Kanban board is the second most critical issue, impacting all Firefox users (11% of total traffic). Good news: rage clicks on the Team Invite button are down 34% since last week's fix to add a loading spinner. Three issues from last week were verified as resolved. Recommendation: prioritise the Settings save button fix — estimated 340 affected users this week alone."

---

PAGE 3: ISSUES (route: "/app/issues")

Header:
- "Detected Issues" (h1) + "AI-detected bugs, UX friction, and broken flows across Planwise. Sorted by severity and impact." (muted)

Filter Bar (horizontal row, white card):
- Severity dropdown: All / Critical / High / Medium / Low
- Issue Type dropdown: All / Bug / Rage Click / Dead Click / Broken Flow / Form Abandonment / Error State / Drop-off
- Page dropdown: All / /board/:id / /settings/project / /settings/team / /onboarding / /tasks / /projects/new / /timeline / /dashboard
- Status dropdown: All / Open / In Progress / Resolved / Ignored
- Sort: Severity (default) / Most Affected Users / Newest / Oldest
All filtering is client-side using state.
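
The client-side filtering could look like the sketch below — a pure function the Issues page would call on each render with its current `useState` values. The `Issue` and `Filters` field names are hypothetical, chosen to match the dropdowns above.

```typescript
type Severity = "Critical" | "High" | "Medium" | "Low";

// Hypothetical issue shape — just the fields the filter bar needs.
interface Issue {
  title: string;
  severity: Severity;
  type: string;        // "Bug" | "Rage Click" | ...
  page: string;        // "/settings/project" | ...
  status: string;      // "Open" | "In Progress" | ...
  affectedUsers: number;
  detectedAt: number;  // epoch ms
}

interface Filters {
  severity: Severity | "All";
  type: string;
  page: string;
  status: string;
  sort: "Severity" | "Most Affected Users" | "Newest" | "Oldest";
}

const SEVERITY_RANK: Record<Severity, number> = {
  Critical: 0, High: 1, Medium: 2, Low: 3,
};

// Every dropdown either matches exactly or is "All" (no-op);
// sorting happens after filtering so counts stay consistent.
function filterIssues(issues: Issue[], f: Filters): Issue[] {
  const out = issues.filter(i =>
    (f.severity === "All" || i.severity === f.severity) &&
    (f.type === "All" || i.type === f.type) &&
    (f.page === "All" || i.page === f.page) &&
    (f.status === "All" || i.status === f.status),
  );
  switch (f.sort) {
    case "Most Affected Users": return out.sort((a, b) => b.affectedUsers - a.affectedUsers);
    case "Newest":              return out.sort((a, b) => b.detectedAt - a.detectedAt);
    case "Oldest":              return out.sort((a, b) => a.detectedAt - b.detectedAt);
    default:                    return out.sort((a, b) => SEVERITY_RANK[a.severity] - SEVERITY_RANK[b.severity]);
  }
}
```

The same pattern covers the Sessions page filters: one pure predicate per dropdown, combined with `&&`.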

Issue Count Bar:
- "Showing 142 issues" — with colour-coded severity counts: "12 Critical · 34 High · 58 Medium · 38 Low"

Issue Cards (list of white cards, detailed):

Issue 1:
- Severity: Rose badge "CRITICAL" — Score: "9.2 / 10"
- Type badge: "Bug" (rose outline)
- Title: "Save button becomes permanently disabled after first click on Project Settings page"
- Page: "/settings/project" — Browser: "All browsers"
- First detected: "Feb 18, 2026" — Sessions affected: "847 of 3,691 (23%)"
- AI Description (in a light gray-bg inset card): "When a user clicks the Save button on the Project Settings page, the button enters a disabled state (opacity: 0.5, pointer-events: none) to prevent double submission. However, the disabled state is never removed after the save request completes — whether it succeeds or fails. Users who need to make a second change on the same page visit cannot save. Most users try clicking the disabled button 3-5 times before navigating away. 68% of affected users do not return to complete their settings change."
- Suggested Fix (in a light indigo-bg inset card with wrench icon): "Re-enable the Save button in both the .then() and .catch() handlers of the save API call. Add a loading spinner to the button during the request, then restore it to its default enabled state on completion. Consider using a form-dirty state to disable the button only when no changes have been made."
- Evidence: "847 sessions · 3,218 rage click events on this button · avg 4.1 clicks per frustrated user"
- Status: white badge with rose dot "Open"
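
The suggested fix's re-enabling pattern can be sketched as follows. The `SaveButton` object and `save` callback are hypothetical stand-ins for the component's state and API call in the real Planwise code; a `finally` handler covers the `.then()` and `.catch()` paths in one place, which is exactly what the buggy code was missing.

```typescript
// Hypothetical button model — a stand-in for React state in the real component.
interface SaveButton {
  disabled: boolean;
  label: string;
}

function handleSave(btn: SaveButton, save: () => Promise<void>): Promise<void> {
  btn.disabled = true;      // prevent double submission during the request…
  btn.label = "Saving…";
  return save()
    .catch(err => { console.error("save failed", err); }) // surface failures instead of hanging
    .finally(() => {        // …but always restore the default state, success or failure
      btn.disabled = false;
      btn.label = "Save";
    });
}
```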

Issue 2:
- Severity: Rose badge "CRITICAL" — Score: "8.8 / 10"
- Type badge: "Bug" (rose outline)
- Title: "Kanban board drag-and-drop fails silently in Firefox — tasks snap back to original position"
- Page: "/board/:id" — Browser: "Firefox only (11% of traffic)"
- First detected: "Feb 12, 2026" — Sessions affected: "312 of 2,847 board sessions (11%) — 100% of Firefox sessions"
- AI Description: "In Firefox, dragging a task card on the Kanban board triggers the drag start animation, but when the card is dropped in a new column, it snaps back to its original position. No error message is shown to the user. The drop event appears to fire but the state update does not persist. Users typically attempt 3-4 drags before switching to the list view or giving up. This is likely related to a Firefox-specific pointer event handling difference in the drag library being used."
- Suggested Fix: "Check the drag-and-drop library version for known Firefox compatibility issues. The likely culprit is the dataTransfer.setData() call — Firefox requires a specific MIME type argument. Alternatively, consider switching from the HTML5 drag API to a pointer-event-based library like @dnd-kit which handles cross-browser consistency."
- Evidence: "312 sessions · All Firefox users affected · 1,247 failed drag attempts · avg 3.8 retries per user"
- Status: white badge with rose dot "Open"
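
The `dataTransfer.setData()` point in the suggested fix can be illustrated with a dragstart handler sketch. Firefox will not proceed with an HTML5 drag (and never fires `drop`) unless the dragstart handler sets data with an explicit format such as `"text/plain"`; Chrome is more forgiving, which is how bugs like this ship. The structural event type below is a stand-in so the sketch doesn't depend on the DOM typings — in the app it would be `React.DragEvent<HTMLElement>`.

```typescript
// Minimal structural slice of a DragEvent (hypothetical; avoids the DOM lib).
interface DragStartEvent {
  dataTransfer: {
    setData(format: string, data: string): void;
    effectAllowed: string;
  };
}

// Setting data with a concrete MIME type in dragstart is what Firefox
// requires before it will deliver drop events for this drag.
function onTaskDragStart(e: DragStartEvent, taskId: string): void {
  e.dataTransfer.setData("text/plain", taskId);
  e.dataTransfer.effectAllowed = "move";
}
```

As the fix notes, a pointer-event-based library such as @dnd-kit sidesteps these HTML5 drag API inconsistencies entirely.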

Issue 3:
- Severity: Amber badge "HIGH" — Score: "7.6 / 10"
- Type badge: "Rage Click" (amber outline)
- Title: "Rage clicks on 'Invite Team Member' button — no loading state or feedback"
- Page: "/settings/team" — Browser: "All browsers"
- First detected: "Feb 8, 2026" — Sessions affected: "289 of 1,204 (24%)"
- AI Description: "When users click the 'Invite Team Member' button, the invitation API call takes 2-4 seconds but the button provides no visual feedback — no spinner, no disabled state, no text change. Users assume the click didn't register and click again, sometimes 5-8 times, triggering duplicate invitation emails. In 34 sessions, users received an error toast saying 'Invitation already sent' which further confused them."
- Suggested Fix: "Add an immediate loading spinner or 'Sending...' text change to the button on click. Disable the button during the API call to prevent duplicate submissions. Show a clear success state ('Invitation sent!') with a checkmark on completion. Consider debouncing the click handler as an additional safeguard."
- Evidence: "312 rage click events this week · 34 duplicate invitation errors · avg 5.2 clicks per frustrated user"
- Status: white badge with amber dot "In Progress"

Issue 4:
- Severity: Amber badge "HIGH" — Score: "7.4 / 10"
- Type badge: "Broken Flow" (amber outline)
- Title: "Onboarding wizard — 41% of new users drop off at Step 3 (Team Setup)"
- Page: "/onboarding" — Browser: "All browsers"
- First detected: "Feb 3, 2026" — Sessions affected: "178 of 434 new user sessions (41%)"
- AI Description: "The onboarding wizard has 5 steps. Steps 1 (Account) and 2 (Create Project) have near-100% completion. At Step 3 (Team Setup), 41% of users abandon. The AI observed two patterns: (1) 62% of abandoners are solo users who don't have team members to invite yet — there is no visible 'Skip' option, only a muted text link that says 'I'll do this later' at the very bottom of the page below the fold. (2) 38% of abandoners enter an email address and click 'Next' without clicking 'Add Member' first, causing a validation error that says 'Please add at least one team member' — contradicting the existence of the skip option."
- Suggested Fix: "Make the 'Skip this step' option a prominent secondary button at the top of the step, not a hidden text link below the fold. Change the validation logic: if the email field has content and the user clicks Next, auto-add the team member rather than showing an error. Alternatively, allow proceeding with zero team members by making the step optional."
- Evidence: "178 drop-offs at Step 3 · 67 users scrolled but missed the skip link · 74 users hit the validation error · only 12% of those who hit the error recovered and completed onboarding"
- Status: white badge with amber dot "Open"
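
The validation change in the suggested fix — auto-adding a typed email instead of erroring, and letting the step proceed empty — reduces to a small pure function. The model below is hypothetical; in the real wizard this would live in the Step 3 "Next" click handler.

```typescript
// Outcome of clicking "Next" on the Team Setup step (hypothetical model).
interface NextResult {
  members: string[];
  proceed: boolean;
}

function handleTeamSetupNext(members: string[], pendingEmail: string): NextResult {
  const trimmed = pendingEmail.trim();
  if (trimmed.length > 0) {
    // Email typed but "Add Member" never clicked: add it instead of
    // showing the "Please add at least one team member" error.
    return { members: [...members, trimmed], proceed: true };
  }
  // Step is optional — zero members is a valid way to continue.
  return { members, proceed: true };
}
```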

Issue 5:
- Severity: Amber badge "HIGH" — Score: "7.1 / 10"
- Type badge: "Form Abandonment" (amber outline)
- Title: "34% of users abandon 'Create Project' form after filling project name"
- Page: "/projects/new" — Browser: "All browsers"
- First detected: "Feb 5, 2026" — Sessions affected: "221 of 650 (34%)"
- AI Description: "Users fill in the project name field but abandon the form when they reach the template selection step. The template picker shows 12 templates in a 4x3 grid with small preview thumbnails and short names, but no descriptions of what each template includes. Users spend an average of 47 seconds scanning templates, then 28% click a template, look at the preview, click back, try another template, and eventually leave the page. The remaining 6% scroll past the templates looking for a 'blank project' option that doesn't exist."
- Suggested Fix: "Add a 'Blank Project' option as the first template in the grid — prominently styled. Add short (1-line) descriptions below each template name explaining what's included (e.g., 'Kanban board with sprint columns and backlog'). Consider adding a hover-preview or side panel that shows template details without navigating away. Reduce the initial template options to 6 most popular, with a 'Show all templates' expandable section."
- Evidence: "221 abandoned sessions · avg 47 sec on template picker before abandoning · 182 template click-then-back events · 39 users searched for 'blank' in the page"
- Status: white badge with amber dot "Open"

Issue 6:
- Severity: Sky badge "MEDIUM" — Score: "5.8 / 10"
- Type badge: "UX Confusion" (sky outline)
- Title: "Users repeatedly toggle between List and Board views on the Tasks page — unclear default"
- Page: "/tasks" — Browser: "All browsers"
- First detected: "Feb 10, 2026" — Sessions affected: "394 of 4,218 (9%)"
- AI Description: "394 users switched between List view and Board view more than 3 times in a single session on the Tasks page. The page defaults to Board view, but 61% of these users ultimately settled on List view. This suggests the default doesn't match user preference for a significant segment. Additionally, the view toggle buttons (two small icons in the top-right) have no labels and no tooltip, making them easy to miss — 22% of new users never discovered the toggle at all."
- Suggested Fix: "Add text labels next to the view toggle icons ('List' and 'Board'). Consider remembering each user's last-used view preference in localStorage. Add a tooltip on first visit explaining the two views. Consider making the default view configurable in user settings."
- Evidence: "394 sessions with 3+ view switches · avg 4.7 toggles per session · 61% settled on List view"
- Status: white badge with sky dot "Open"
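
The "remember each user's last-used view" part of the suggested fix is a few lines of localStorage plumbing. The storage key is a hypothetical example; the helpers take a `Storage`-like parameter so the logic can be exercised with an in-memory stub outside the browser (in the app you'd pass `window.localStorage`).

```typescript
type View = "list" | "board";

// Subset of the browser Storage interface used by the helpers.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const VIEW_KEY = "planwise.tasks.view"; // hypothetical storage key

// Fall back to the current default ("board") when nothing is saved
// or the stored value is unrecognised.
function loadViewPreference(store: KVStore): View {
  const saved = store.getItem(VIEW_KEY);
  return saved === "list" || saved === "board" ? saved : "board";
}

function saveViewPreference(store: KVStore, view: View): void {
  store.setItem(VIEW_KEY, view);
}
```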

Issue 7:
- Severity: Sky badge "MEDIUM" — Score: "5.5 / 10"
- Type badge: "Dead Click" (sky outline)
- Title: "Dead clicks on task status text in List view — users expect inline editing"
- Page: "/tasks" — Browser: "All browsers"
- First detected: "Feb 14, 2026" — Sessions affected: "267 of 4,218 (6%)"
- AI Description: "In the task list view, users click on the status label text ('To Do', 'In Progress', 'Done') expecting a dropdown to appear for inline status changes. The status text is styled with a coloured badge that looks interactive but is not clickable — users must right-click or open the task detail modal to change status. 267 users clicked directly on the status badge with no result."
- Suggested Fix: "Make the status badge clickable in the list view with an inline dropdown to change status. Use cursor:pointer and a subtle hover effect to signal interactivity. This is a common pattern in tools like Linear and Asana that users expect."
- Evidence: "267 sessions · 1,034 dead clicks on status badges · users who found the right-click menu took avg 12 seconds to discover it"
- Status: white badge with sky dot "Open"

Issue 8:
- Severity: Sky badge "MEDIUM" — Score: "5.2 / 10"
- Type badge: "Error State" (sky outline)
- Title: "Timeline view throws uncaught TypeError on projects with no due dates set"
- Page: "/timeline" — Browser: "All browsers"
- First detected: "Feb 16, 2026" — Sessions affected: "89 of 1,547 (6%)"
- AI Description: "When a user navigates to the Timeline view and their project contains tasks without due dates, a JavaScript TypeError ("Cannot read properties of null (reading 'getTime')") causes the timeline to render as a blank white screen. No error message is shown to the user. 89 users encountered this blank screen, and 71 of them immediately navigated away. The error occurs in the date-sorting function that assumes all tasks have a dueDate property."
- Suggested Fix: "Add a null check for task.dueDate before calling .getTime(). Tasks without due dates should either be excluded from the timeline with a visible message ('3 tasks without due dates are not shown — add dates to see them here'), or placed in an 'Unscheduled' section at the bottom of the timeline. Add an error boundary to the Timeline component to show a helpful fallback UI instead of a blank screen."
- Evidence: "89 sessions · 71 immediate bounces · TypeError in timeline-sort.js:47"
- Status: white badge with sky dot "Open"
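
The null-check half of the suggested fix can be sketched as a pure function: partition tasks before sorting so only scheduled tasks ever call `.getTime()`, with unscheduled ones routed to the "Unscheduled" section. The `Task` shape is a hypothetical reduction of whatever timeline-sort.js actually consumes.

```typescript
interface Task {
  title: string;
  dueDate: Date | null; // tasks without a due date caused the crash
}

// Splitting first means the sort comparator only ever sees non-null dates,
// so the null dereference from timeline-sort.js:47 cannot occur.
function partitionForTimeline(tasks: Task[]): { scheduled: Task[]; unscheduled: Task[] } {
  const scheduled = tasks
    .filter(t => t.dueDate !== null)
    .sort((a, b) => a.dueDate!.getTime() - b.dueDate!.getTime());
  const unscheduled = tasks.filter(t => t.dueDate === null);
  return { scheduled, unscheduled };
}
```

An error boundary around the Timeline component remains worthwhile as a second line of defence, per the fix text.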

Issue 9:
- Severity: Slate badge "LOW" — Score: "3.4 / 10"
- Type badge: "Cosmetic" (slate outline)
- Title: "Tooltip on timeline zoom slider clips off-screen on viewports under 1200px"
- Page: "/timeline" — Browser: "Chrome, Edge"
- First detected: "Feb 19, 2026" — Sessions affected: "43 of 1,547 (3%)"
- AI Description: "The zoom slider in the top-right corner of the Timeline view has a tooltip that appears above the slider. On viewports narrower than 1200px, the tooltip extends beyond the right edge of the viewport and is partially clipped. The tooltip shows the current zoom percentage. No functional impact — zoom still works correctly."
- Suggested Fix: "Reposition the tooltip to appear to the left of the slider instead of above it, or use a tooltip library that auto-detects viewport boundaries and repositions accordingly."
- Evidence: "43 sessions on viewports 1024-1199px wide"
- Status: white badge with slate dot "Open"

Issue 10:
- Severity: Slate badge "LOW" — Score: "2.8 / 10"
- Type badge: "UX Friction" (slate outline)
- Title: "Keyboard shortcut hints not visible — users type '/' to search but nothing happens"
- Page: "All pages" — Browser: "All browsers"
- First detected: "Feb 11, 2026" — Sessions affected: "38 of 8,429 (0.5%)"
- AI Description: "38 users pressed the '/' key expecting a search shortcut (common in tools like GitHub, Notion, and Linear). No search box opened. The actual keyboard shortcut for search in Planwise is Cmd+K / Ctrl+K, but this is not documented anywhere visible in the UI. These users are likely power users migrating from other tools."
- Suggested Fix: "Add '/' as an alternative shortcut for opening the search bar, matching the convention used by GitHub, Slack, and Notion. Add a subtle keyboard shortcut hint next to the search icon in the header (e.g., a small 'Cmd+K' badge). Consider adding a keyboard shortcuts help panel accessible via '?'."
- Evidence: "38 sessions · 56 total '/' keypresses with no response"
- Status: white badge with slate dot "Open"
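
The shortcut change in the suggested fix comes down to a small matcher: accept both the existing Cmd/Ctrl+K and the "/" convention, but ignore "/" while the user is typing in a field. The structural event type below (including the precomputed `targetIsTextInput` flag) is a hypothetical stand-in for a real `KeyboardEvent` plus an `event.target` check.

```typescript
// Minimal structural slice of a KeyboardEvent (hypothetical; avoids the DOM lib).
interface KeyEventLike {
  key: string;
  metaKey: boolean;
  ctrlKey: boolean;
  targetIsTextInput: boolean; // would be derived from event.target in the app
}

function opensSearch(e: KeyEventLike): boolean {
  // Existing shortcut: Cmd+K (macOS) / Ctrl+K (Windows, Linux).
  if ((e.metaKey || e.ctrlKey) && e.key.toLowerCase() === "k") return true;
  // New alternative: bare "/", but never while typing in an input or textarea.
  if (e.key === "/" && !e.metaKey && !e.ctrlKey && !e.targetIsTextInput) return true;
  return false;
}
```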

---

PAGE 4: SESSIONS (route: "/app/sessions")

Header:
- "Analysed Sessions" (h1) + "Every session is watched by AI and summarised. Browse and filter analysed recordings." (muted)

Filter Bar (horizontal row, white card):
- Date Range: "Last 7 Days" / "Last 30 Days" / "Last 90 Days" (Last 30 Days default)
- Issues Found: "Any" / "Has Issues" / "No Issues"
- User Type: "All" / "New Users" / "Returning Users"
- Page Visited: All / /board/:id / /settings/project / /settings/team / /onboarding / /tasks / /projects/new / /timeline / /dashboard
- AI Sentiment: "All" / "Frustrated" / "Confused" / "Smooth"
All filtering is client-side using state.

Session Stats Bar (inline):
- "8,429 sessions analysed" — "3,847 with issues detected (46%)" — "4,582 smooth sessions (54%)"

Session Cards (list of white cards):

Session 1:
- Session ID: "#S-8429" — Duration: "4m 32s" — Date: "Feb 25, 2026, 9:14 AM"
- User: avatar "JW" + "James Walker" — Type: "Returning User" — Browser: "Firefox 135 / macOS" — Location: "San Francisco, US"
- AI Sentiment badge: Rose "Frustrated"
- Issues Found: "2" (rose count badge)
- Pages Visited: /dashboard → /board/proj-alpha → /board/proj-alpha (drag attempt) → /board/proj-alpha (drag attempt) → /tasks → /tasks (view toggle x3)
- AI Session Summary (light gray-bg inset): "User attempted to move a task card from 'In Progress' to 'Done' on the Kanban board three times — each attempt failed silently. After the third failed drag, the user paused for 8 seconds (likely confused), then switched to the List view and changed the task status from there. The user then toggled between List and Board view twice more, possibly checking if the board was updated. Session ended on the Tasks page in List view."
- Issues Detected: two small linked pills — "Kanban drag-and-drop fails in Firefox" (rose) and "List/Board toggle confusion" (sky)

Session 2:
- Session ID: "#S-8427" — Duration: "6m 18s" — Date: "Feb 25, 2026, 8:47 AM"
- User: avatar "LP" + "Lisa Park" — Type: "New User (Day 1)" — Browser: "Safari 17 / macOS" — Location: "Austin, US"
- AI Sentiment badge: Amber "Confused"
- Issues Found: "1" (amber count badge)
- Pages Visited: /onboarding/step-1 → /onboarding/step-2 → /onboarding/step-3 → /onboarding/step-3 (scroll down) → /onboarding/step-3 (scroll up) → exit
- AI Session Summary: "New user completed onboarding Steps 1 and 2 quickly (under 90 seconds total). At Step 3 (Team Setup), the user paused for 22 seconds, scrolled down slowly, then back up. They did not enter any email addresses. After 1 minute 40 seconds on this step, the user scrolled down to the very bottom of the page, paused for 3 seconds on the 'I'll do this later' text link, but did not click it — possibly didn't recognise it as clickable. The user then closed the tab. Onboarding was not completed."
- Issues Detected: "Onboarding Step 3 drop-off" (amber)

Session 3:
- Session ID: "#S-8421" — Duration: "2m 55s" — Date: "Feb 25, 2026, 8:12 AM"
- User: avatar "RK" + "Raj Krishnamurthy" — Type: "Returning User" — Browser: "Chrome 122 / Windows" — Location: "London, UK"
- AI Sentiment badge: Rose "Frustrated"
- Issues Found: "1" (rose count badge)
- Pages Visited: /dashboard → /settings/project → /settings/project (click Save) → /settings/project (click Save x4) → /settings/project → /dashboard
- AI Session Summary: "User navigated directly to Project Settings and changed the project name. Clicked Save — button became disabled. User waited 5 seconds, then attempted to click the disabled Save button 4 more times over 12 seconds. After a pause of 8 seconds (likely frustration), the user navigated back to the Dashboard without confirming if the change was saved. The project name change did persist (the API call succeeded), but the user had no way of knowing due to the broken button state."
- Issues Detected: "Save button disabled permanently" (rose)

Session 4:
- Session ID: "#S-8418" — Duration: "8m 44s" — Date: "Feb 24, 2026, 4:33 PM"
- User: avatar "CM" + "Chen Min" — Type: "Returning User" — Browser: "Chrome 122 / macOS" — Location: "New York, US"
- AI Sentiment badge: Emerald "Smooth"
- Issues Found: "0"
- Pages Visited: /dashboard → /tasks → /board/proj-beta → /board/proj-beta (drag task) → /board/proj-beta (drag task) → /timeline → /settings/team → /dashboard
- AI Session Summary: "Smooth session with no issues detected. User checked the dashboard, reviewed tasks in the board view, successfully moved two tasks between columns, checked the timeline, visited team settings briefly, and returned to the dashboard. All interactions performed as expected. Session duration and navigation pattern consistent with a regular check-in workflow."
- Issues Detected: "None — smooth session"

Session 5:
- Session ID: "#S-8415" — Duration: "3m 07s" — Date: "Feb 24, 2026, 3:55 PM"
- User: avatar "SK" + "Sarah Kim" — Type: "New User (Day 1)" — Browser: "Firefox 123 / Windows" — Location: "Toronto, CA"
- AI Sentiment badge: Amber "Confused"
- Issues Found: "2" (amber count badge)
- Pages Visited: /onboarding/step-1 → /onboarding/step-2 → /projects/new → /projects/new (template browsing) → /projects/new (template browsing) → exit
- AI Session Summary: "New user completed onboarding Steps 1 and 2, then skipped to 'Create Project'. On the project creation form, user entered a project name ('Marketing Campaign Tracker'), then spent 1 minute 52 seconds on the template selection grid. Clicked on 3 different templates, viewed each preview for 5-8 seconds, clicked back each time. Scrolled to the bottom of the template grid and back up. After a 6-second pause, the user closed the tab without selecting a template or creating a project."
- Issues Detected: "Create Project form abandonment" (amber) and "Template selection confusion" (sky)

Session 6:
- Session ID: "#S-8412" — Duration: "1m 23s" — Date: "Feb 24, 2026, 2:18 PM"
- User: avatar "AT" + "Aisha Toure" — Type: "Returning User" — Browser: "Chrome 122 / macOS" — Location: "Chicago, US"
- AI Sentiment badge: Rose "Frustrated"
- Issues Found: "1" (rose count badge)
- Pages Visited: /dashboard → /settings/team → /settings/team (click Invite x6) → /settings/team
- AI Session Summary: "User navigated to Team Settings and clicked the 'Invite Team Member' button. No visible response occurred. User clicked the button 5 more times rapidly over 4 seconds (rage click pattern). After a 3-second pause, a success toast appeared (for the first click), immediately followed by an error toast saying 'Invitation already sent.' The user stared at the screen for 4 seconds, then navigated away. Six duplicate invitation emails were sent to the same address."
- Issues Detected: "Invite button rage clicks — no loading state" (amber)

---

PAGE 5: INSIGHTS & TRENDS (route: "/app/insights")

Header:
- "Insights & Trends" (h1) + "AI-detected patterns, regression alerts, and performance trends across Planwise." (muted)
- Date range label at top right: "Last 30 Days" (styled as a dropdown but non-functional)

Top Metrics Row (4 white cards):
1. "Total Issues (30d)" — "142" — Trend: rose up arrow "↑ 18 from last month" — "Issue rate: 17 per 1,000 analysed sessions" (142 issues ÷ 8,429 sessions, keeping the stat consistent with the Sessions page)
2. "Resolved Issues" — "94" — Trend: emerald — "83% resolved within 7 days"
3. "Avg Resolution Time" — "4.2 days" — Trend: emerald down arrow "↓ 1.1 days faster than last month"
4. "User Frustration Index" — "2.4 / 10" — Trend: emerald down arrow "↓ 0.6 from last month (improving)"

Chart 1 — "Issues Detected Over Time" (Line chart, full width white card):
- X-axis: Sep 2025, Oct 2025, Nov 2025, Dec 2025, Jan 2026, Feb 2026
- Three lines:
  - "Critical + High" in rose: 18, 22, 28, 31, 24, 46
  - "Medium" in sky blue: 32, 35, 41, 48, 52, 58
  - "Low" in slate: 22, 25, 28, 31, 35, 38
- Note below: "Critical + High spike in Feb coincides with v2.4 release (Feb 3) which introduced the new Kanban board."

Chart 2 — "Issue Types Breakdown" (Donut chart, white card):
- Bugs: 28% — Rose
- Rage Clicks: 22% — Amber
- Broken Flows: 18% — Indigo
- Form Abandonment: 14% — Violet
- Dead Clicks: 10% — Sky blue
- Error States: 8% — Slate
- Center text: "142 total issues"

Chart 3 — "Resolution Velocity" (Bar chart, white card):
- Monthly bars for last 6 months showing issues resolved:
  - Sep: 48
  - Oct: 54
  - Nov: 62
  - Dec: 71
  - Jan: 84
  - Feb: 94
- Emerald coloured bars with slight gradient
- Trend label below: "Issue resolution rate improving month over month — team is closing issues 38% faster since adopting Replaydar."

Chart 4 — "Frustration Heatmap by Page" (Horizontal bar chart, white card):
- /settings/project — Frustration Index: 7.8 (rose bar)
- /board/:id — Frustration Index: 7.2 (rose bar)
- /onboarding — Frustration Index: 6.4 (amber bar)
- /settings/team — Frustration Index: 5.9 (amber bar)
- /projects/new — Frustration Index: 5.1 (amber bar)
- /tasks — Frustration Index: 3.2 (sky bar)
- /timeline — Frustration Index: 2.8 (sky bar)
- /dashboard — Frustration Index: 1.1 (emerald bar)
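
Mapping a page's frustration index to the rose/amber/sky/emerald bar colours can be done with a small threshold function. A sketch — the cut-off values (7, 4.5, 2) are assumptions chosen so that the bars listed above get the stated colours:

```typescript
// Maps a 0-10 frustration index to a bar colour. Thresholds (7, 4.5, 2)
// are assumed cut-offs that reproduce the example bars above.
const FRUSTRATION_COLOURS = {
  rose: "#EF4444",
  amber: "#F59E0B",
  sky: "#0EA5E9",
  emerald: "#10B981",
} as const;

function frustrationColour(index: number): string {
  if (index >= 7) return FRUSTRATION_COLOURS.rose;
  if (index >= 4.5) return FRUSTRATION_COLOURS.amber;
  if (index >= 2) return FRUSTRATION_COLOURS.sky;
  return FRUSTRATION_COLOURS.emerald;
}
```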

"Regression Alerts" Card (white card, rose left border):
- Header: "Regression Alerts" with AlertTriangle icon
- 2 regression items:
  1. Rose dot — "Kanban drag-and-drop failure rate spiked from 0% to 100% for Firefox users after v2.4 release (Feb 3, 2026). This was not present in v2.3. Likely introduced by the drag library update in commit #a3f8d2." — Detected: Feb 5
  2. Amber dot — "Onboarding Step 3 drop-off rate increased from 28% to 41% after the team setup step was redesigned on Jan 22, 2026. The previous version had a prominent 'Skip' button — the redesign replaced it with a text link below the fold." — Detected: Jan 28

"Baseline Behaviours" Card (white card, emerald left border):
- Header: "Learned Baselines" with Brain icon — "Replaydar learns what 'normal' looks like for Planwise and alerts on deviations."
- 4 baseline items:
  1. Emerald dot — "Normal session duration: 4-8 minutes. Sessions under 1 minute are flagged as potential bounces. Sessions over 15 minutes are flagged as potential confusion loops."
  2. Emerald dot — "Normal rage click rate: < 2 per 1,000 sessions per page. Current: 1.4 avg (within normal). Exception: /settings/team at 8.7 (above threshold — flagged)."
  3. Emerald dot — "Normal onboarding completion rate: 72%. Current: 59% (below threshold — flagged since Jan 28)."
  4. Emerald dot — "Normal form completion rate on /projects/new: 78%. Current: 66% (below threshold — flagged since Feb 5)."
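
The baseline-deviation logic the card describes can be sketched as: each learned baseline carries an expected value and a "bad" direction, and a current value crossing it in that direction gets flagged. The type and field names here are illustrative assumptions:

```typescript
// A learned baseline: flag when the current value crosses the expected
// value in the "bad" direction. All names here are illustrative.
interface LearnedBaseline {
  metric: string;
  expected: number;
  badWhen: "above" | "below"; // direction that counts as a deviation
}

function isFlagged(baseline: LearnedBaseline, current: number): boolean {
  return baseline.badWhen === "above"
    ? current > baseline.expected
    : current < baseline.expected;
}

// The baselines listed above, as data.
const baselines: LearnedBaseline[] = [
  { metric: "rage clicks per 1,000 sessions", expected: 2, badWhen: "above" },
  { metric: "onboarding completion %", expected: 72, badWhen: "below" },
  { metric: "/projects/new form completion %", expected: 78, badWhen: "below" },
];
```

With this shape, `/settings/team` at 8.7 rage clicks is flagged, the 1.4 platform average is not, and the 59% onboarding completion rate trips the 72% baseline — matching the four items above.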

"AI Pattern Detection" Card (white card, subtle indigo left border):
- Heading: "This Month's AI-Detected Patterns"
- 4 insight bullets with small Sparkle icons:
  - "Firefox users are 3.2x more likely to encounter critical bugs than Chrome users. Two of the three critical browser-specific issues trace back to the v2.4 drag library update."
  - "New users who complete onboarding in under 5 minutes have a 72% 30-day retention rate. Those who take longer than 8 minutes have only 31% retention. The Step 3 bottleneck is the primary cause of extended onboarding times."
  - "The 'Create Project' form has a 34% abandonment rate, but users who select the 'Simple Kanban' template have a 91% form completion rate — highest of any template. Consider making it the default or first option."
  - "Rage click frequency is highest between 2-4 PM EST, correlating with peak usage. The Invite Team Member button accounts for 41% of all rage click events platform-wide."

"Before & After Replaydar" Card (full width white card, bottom):
- Heading: "Impact Since Connecting Replaydar"
- Subtitle: "Planwise connected FullStory to Replaydar on September 1, 2025. Here's the before vs. after."
- 4 comparison rows (Before Replaydar vs After Replaydar):
  - Bugs detected per month: ~3 (from manual QA) vs 142 (AI-detected) — "47x more issues found"
  - Avg time to discover a bug: 34 days vs 2.1 hours — "99.7% faster detection"
  - Sessions reviewed: < 1% (manual) vs 100% (automated) — "Full coverage"
  - UX issues in backlog: 8 (vague, unverified) vs 142 (specific, evidence-backed) — "17x more actionable"
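
The headline multipliers in this card are derivable from the raw before/after numbers, which keeps the dummy data internally consistent. A quick sketch of the arithmetic — the rounding choices are assumptions made to reproduce the stated labels:

```typescript
// Derive the comparison-row headlines from the raw before/after numbers.
const bugsBefore = 3;
const bugsAfter = 142;
const bugsMultiplier = Math.round(bugsAfter / bugsBefore); // "47x more issues found"

const discoverBeforeHours = 34 * 24; // 34 days, in hours
const discoverAfterHours = 2.1;
const fasterPct = (1 - discoverAfterHours / discoverBeforeHours) * 100; // "99.7% faster"

const backlogBefore = 8;
const backlogAfter = 142;
const backlogMultiplier = Math.floor(backlogAfter / backlogBefore); // "17x more actionable"
```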

---

NAVIGATION BEHAVIOR:
- The sidebar should highlight the active page with an indigo left border and indigo-tinted background
- Clicking sidebar items navigates between pages
- The landing page is separate from the app pages — it has its own layout without the sidebar
- All "View Details" and navigation links should work and use the router
- All buttons labelled as non-functional should look clickable but do nothing or show a subtle tooltip saying "Demo mode"
- Mobile responsive: sidebar collapses to a hamburger menu on small screens

---

DATA BEHAVIOR:
- All data is hardcoded / static dummy data. No databases, no APIs, no real calculations.
- Numbers, dates, names, and statuses are all fake but realistic.
- The simulated product being analysed is "Planwise" — a B2B SaaS project management tool.
- Pages referenced throughout: /dashboard, /board/:id, /settings/project, /settings/team, /onboarding, /tasks, /projects/new, /timeline.
- Issue types used: Bug, Rage Click, Dead Click, Broken Flow, Form Abandonment, Error State, UX Confusion, Cosmetic, UX Friction.
- Severity levels: Critical (rose, 8-10 score), High (amber, 6-8 score), Medium (sky blue, 4-6 score), Low (slate, 1-4 score).
- Session user names: James Walker, Lisa Park, Raj Krishnamurthy, Chen Min, Sarah Kim, Aisha Toure, Marcus Chen, Olivia Santos, Tyler Brooks, Nina Petrov, Diego Ramirez.
- Browsers: Chrome 122, Firefox 123, Safari 17, Edge 122.
- Integration sources: Hotjar, FullStory (connected), LogRocket, Clarity, PostHog (connected). Two connected, three available.
- The app should feel like a real, lived-in product — not an empty shell. Every issue should cross-reference correctly between the Issues page, the Sessions page, and the Dashboard.
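
Note that the severity score bands above share their boundary values (8-10, 6-8, 4-6, 1-4). A sketch of a score-to-severity mapping that resolves this by treating each lower bound as inclusive of the higher band — an assumption, since the spec doesn't say:

```typescript
type Severity = "Critical" | "High" | "Medium" | "Low";

// Score bands from the spec: Critical 8-10, High 6-8, Medium 4-6, Low 1-4.
// Boundary scores (8, 6, 4) are assumed to belong to the higher band.
function severityForScore(score: number): Severity {
  if (score >= 8) return "Critical";
  if (score >= 6) return "High";
  if (score >= 4) return "Medium";
  return "Low";
}

// Semantic badge colours from the visual style guide.
const SEVERITY_COLOURS: Record<Severity, string> = {
  Critical: "#EF4444", // rose
  High: "#F59E0B",     // amber
  Medium: "#0EA5E9",   // sky blue
  Low: "#94A3B8",      // slate
};
```

Deriving the badge label and colour from one numeric score keeps the Issues page, Dashboard feed, and charts consistent automatically.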

---

IMPORTANT NOTES:
- The landing page should feel clean and authoritative — think Linear or Vercel marketing pages in light mode.
- The dashboard and app pages should feel like a real SaaS product with rich, dense data — think Linear or Notion in light mode.
- Charts should use the app colour palette (indigo for primary data, rose for critical, emerald for success, gray for benchmarks).
- Ensure all data is internally consistent — the same issue titles, session IDs, customer names, and metrics should make sense across all pages.
- Make the site fully responsive — mobile, tablet, desktop.
- Use framer-motion sparingly: landing page scroll reveals, animated counters, hover effects on cards. Don't animate page transitions or sidebar toggles.
- All select dropdowns and dialog modals should use shadcn/ui components styled to match the light theme (white backgrounds, gray borders).
- If any page is too complex, prioritise in this order: (1) Dashboard, (2) Issues, (3) Landing Page, (4) Sessions, (5) Insights.

---

Refinement Prompts

Use these one at a time after the initial app generates. Paste each into Base44's chat.

Prompt R1 — Design Polish

Refine the visual design across all pages. Ensure card shadows are subtle and consistent (0 1px 3px rgba(0,0,0,0.08)). Add a subtle indigo radial glow behind the hero section headline on the landing page at very low opacity. Ensure all severity badges are pill-shaped with rounded-full corners, consistent padding (px-2.5 py-0.5), and the correct semantic colours — rose for Critical, amber for High, sky for Medium, slate for Low. Add hover effects on all clickable cards that slightly strengthen the shadow and add a faint indigo border-colour transition. Make sure the sidebar has a subtle separator line between the nav items and the integration status section at the bottom. Ensure the active sidebar item has an indigo left border accent (3px) and a light indigo background tint (#EEF2FF). All issue type badges should have an outline style (coloured border, transparent background, coloured text). The AI description and suggested fix inset cards on the Issues page should have distinct background tints — light gray (#F9FAFB) for descriptions, light indigo (#EEF2FF) for suggested fixes.

Prompt R2 — Charts and Visualizations

Add proper chart visualizations on the Insights page. Use Recharts for: a multi-line chart for Issues Detected Over Time showing Critical+High (rose), Medium (sky), and Low (slate) over 6 months. A donut chart for Issue Types Breakdown (rose for Bugs, amber for Rage Clicks, indigo for Broken Flows, violet for Form Abandonment, sky for Dead Clicks, slate for Error States). A bar chart for Resolution Velocity showing monthly bars in emerald gradient. A horizontal bar chart for Frustration Heatmap by Page using rose-to-emerald colour scale based on frustration index value. On the Dashboard page, add the horizontal stacked severity bar for Issue Distribution by Severity (rose, amber, sky, slate segments). Add small sparkline trend indicators inside each of the 4 Dashboard metric cards. All chart cards should have a white background with the standard subtle shadow.
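
The donut chart data might be shaped like this for Recharts — a sketch using the `{ name, value, fill }` objects its `<Pie>` component accepts, with values taken from the Issue Types Breakdown on the Insights page:

```typescript
// Donut slice data for Recharts' <Pie>. Values are the percentages from
// the Issue Types Breakdown spec; fills follow the app palette.
const issueTypeSlices = [
  { name: "Bugs", value: 28, fill: "#EF4444" },             // rose
  { name: "Rage Clicks", value: 22, fill: "#F59E0B" },      // amber
  { name: "Broken Flows", value: 18, fill: "#6366F1" },     // indigo
  { name: "Form Abandonment", value: 14, fill: "#8B5CF6" }, // violet
  { name: "Dead Clicks", value: 10, fill: "#0EA5E9" },      // sky
  { name: "Error States", value: 8, fill: "#94A3B8" },      // slate
];
```

In the component this would be rendered as something like `<Pie data={issueTypeSlices} dataKey="value" innerRadius={60} outerRadius={80} />`, with the "142 total issues" label centred inside the ring.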

Prompt R3 — Landing Page Animations

Add subtle scroll-triggered fade-in animations to the landing page. The hero headline should have a slight fade-up animation on page load (opacity 0 to 1, translateY 20px to 0, over 0.6 seconds). The "How It Works" step cards should stagger in as the user scrolls (each card 150ms after the previous). The feature cards should fade in with a slight scale effect (0.95 to 1.0). The floating stats card in the hero should have a count-up number animation when it enters the viewport. The integration logos row should fade in as a group. The testimonial cards should slide in from below. The hero eyebrow text should have a subtle tracking animation (letter-spacing expanding slightly on load).
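
These reveals can be expressed as framer-motion variant objects — a sketch; the 0.6s hero timing, 150ms stagger, and 0.95→1.0 scale come from the prompt above, everything else is an assumed default:

```typescript
// framer-motion variant objects for the scroll reveals described above.
// Plain data only — pass them to <motion.div variants={...}
// initial="hidden" whileInView="visible">.
const heroHeadline = {
  hidden: { opacity: 0, y: 20 },
  visible: { opacity: 1, y: 0, transition: { duration: 0.6 } },
};

// Parent container staggers each "How It Works" card 150ms after the last.
const stepList = {
  hidden: {},
  visible: { transition: { staggerChildren: 0.15 } },
};

// Feature cards fade in with the slight scale effect.
const featureCard = {
  hidden: { opacity: 0, scale: 0.95 },
  visible: { opacity: 1, scale: 1 },
};
```

Keeping variants as shared data objects (rather than inline props) makes it easy to reuse the same stagger timing across the step cards, testimonials, and integration logos.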

Prompt R4 — Responsive Layout and Mobile

Ensure full mobile responsiveness. The landing page navigation should collapse to a hamburger menu on screens under 768px. The sidebar on app pages should collapse to a hamburger menu or slide-in overlay on mobile. All card grids (metrics, features, integrations) should stack to single column on phone screens. Charts should be horizontally scrollable on mobile if they would otherwise be too compressed. The issue cards on the Issues page should stack all sections vertically with full-width AI description and suggested fix blocks. The session cards should display the pages-visited flow as a vertical step list on mobile rather than a horizontal path. Font sizes should scale down gracefully — hero headline to text-3xl on mobile. Touch targets on mobile should be at least 44px. Filter bars should become vertically stacked dropdowns on mobile.

Prompt R5 — Activity Feed and Issue Details Polish

Add small coloured severity dots next to each issue in the Dashboard's Recent Issues Feed — rose circles for Critical, amber for High, sky for Medium, slate for Low. In the Sessions page, add a small user avatar circle with initials next to each session card header. Add a small colour-coded sentiment indicator next to each session's AI summary — a rose dot for Frustrated, amber for Confused, emerald for Smooth. On the Issues page, add a small "Evidence" section at the bottom of each issue card with a horizontal row of mini-stat pills: sessions affected count, total event count (clicks/errors), and average retries — each in a small rounded pill with muted background. Add subtle alternating background tints on the session cards for better visual separation (alternate between white and #FAFAFA).

Final Touch Prompts

Prompt F1 — Demo Mode Indicator

Add a subtle "Demo Mode" indicator badge in the top-right corner of every app page (not the landing page). It should be a small pill badge with an indigo outline and the text "Demo Mode" in gray-500 text on a white background. It should be fixed-position so it stays visible while scrolling. Add a small info icon before the text. On hover, show a tooltip: "This is a demo with sample data from Planwise, a fictional SaaS product."

Prompt F2 — Toast Notifications

When any non-functional button is clicked (like "Manage Integrations", filter dropdowns that don't actually filter, "Export" buttons, etc.), show a small toast notification that slides in from the bottom-right corner saying "This feature is available in the full version." The toast should auto-dismiss after 3 seconds and have a white background with a subtle shadow, gray-700 text, and a small indigo left border (3px). Add a subtle slide-up entrance animation and fade-out exit animation. Use the Sonner or shadcn toast component if available.
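
A dependency-free sketch of the shared demo-button handler — `notify` is injected so the logic stays testable without a toast library; in the app you would pass Sonner's `toast` export (or the shadcn toast function) as the argument:

```typescript
// Wraps any non-functional control's onClick so it raises the demo toast.
// `notify` stands in for the toast library's function (e.g. Sonner's `toast`).
const DEMO_TOAST_MESSAGE = "This feature is available in the full version.";

function makeDemoClickHandler(notify: (message: string) => void): () => void {
  return () => notify(DEMO_TOAST_MESSAGE);
}
```

Wiring every inert button through one factory guarantees the toast copy stays identical everywhere and gives a single place to add the 3-second auto-dismiss and slide-up styling.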