Platform Overview
Dashboard & Projects
The Dashboard is the central hub for managing all research projects. Projects are organized by status into collapsible sections, making it easy to focus on active work while keeping historical data accessible.
Project Statuses
| Status | Badge Color | Description |
|---|---|---|
| Draft | Gray | Projects in planning phase, not yet actively collecting data |
| Active | Green | Currently running studies with ongoing data collection |
| Completed | Blue | Finished studies with all data collected |
| Archived | Yellow | Historical projects preserved for reference |
Project Modes
- Moderated: A researcher guides participants through tasks in real time, allowing for follow-up questions and observation
- Unmoderated: Participants complete tasks independently, typically with screen recording for later review
Project Card Metrics
Each project card displays key metrics at a glance:
- Participants: Number of participants enrolled in the study
- Tasks: Number of tasks defined for the study
- Insights: Count of sticky notes/observations captured
- Sessions: Completed sessions out of total planned sessions
Recording Options
Projects can be configured with camera and microphone settings that control how participant sessions are recorded.
| Setting | Behavior |
|---|---|
| Optional | Participants can choose whether to enable the device |
| Required | Participants must enable the device to proceed |
| Disabled | The device will not be requested or used |
Camera controls screen recording capability. Microphone controls audio capture for think-aloud protocols.
Tags
Tags are labels you apply to projects to categorize and group them. Tags serve two key purposes:
- Organization: Group related projects (e.g., "Q4-2025", "Mobile App", "Onboarding")
- Trend Analysis: Enable cross-project NPS and SUS comparison for all projects sharing a tag
Tip: Use consistent tag naming to enable meaningful trend analysis across related research cycles.
NPS (Net Promoter Score)
Net Promoter Score measures customer loyalty by asking: "How likely are you to recommend this product to a friend or colleague?" Responses are on a 0-10 scale.
Response Categories
- Promoters (9-10): Enthusiastic users likely to recommend the product
- Passives (7-8): Satisfied but not enthusiastic; vulnerable to competitors
- Detractors (0-6): Unhappy users who may damage the brand through negative word-of-mouth
NPS Calculation
NPS = % Promoters − % Detractors
The score ranges from -100 (all detractors) to +100 (all promoters).
SUS (System Usability Scale)
The System Usability Scale is a standardized 10-question survey that produces a score from 0-100, measuring perceived usability of a system.
Score Categories
| Score Range | Grade | Interpretation |
|---|---|---|
| 80-100 | Excellent | Best imaginable usability |
| 68-79 | Good | Above average usability |
| 51-67 | OK | Below average usability |
| 0-50 | Poor | Significant usability problems |
Note: The industry average SUS score is approximately 68. Scores above this indicate above-average usability.
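The grade bands above can be expressed as a small lookup function. This is an illustrative sketch; the function name and signature are not part of the platform's API:

```python
def sus_grade(score: float) -> str:
    """Return the grade band for a SUS score, per the table above."""
    if not 0 <= score <= 100:
        raise ValueError("SUS scores range from 0 to 100")
    if score >= 80:
        return "Excellent"
    if score >= 68:
        return "Good"
    if score >= 51:
        return "OK"
    return "Poor"
```

For example, a project averaging 72 falls in the Good band, just above the industry average of 68.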
Trend Analysis
Trend Analysis allows you to track NPS and SUS scores across multiple projects that share the same tag. This reveals how user satisfaction and usability evolve over time or across different product areas.
Key features
- Filter by tag to compare related projects
- Line charts show score progression across projects
- Summary statistics including overall averages and trend direction
- Breakdown tables with per-project details
- Only Active and Completed projects are included in trend analysis
Research Questions & Hypotheses
Research Questions are the foundational inquiries that guide your study. They define what you're trying to learn and help organize hypotheses into logical groups.
Hypotheses are testable assumptions about user behavior, needs, or pain points. Each hypothesis is linked to a research question and tracked through validation states as you gather evidence.
Validation Statuses
- Testing: Currently gathering evidence
- Validated: Evidence strongly supports (>70%)
- Disproven: Evidence contradicts (<30%)
- Unclear: Mixed evidence (30-70%)
Tasks & Affinity Mapping
Tasks are structured activities participants complete during usability testing. They include scenarios, step-by-step instructions, success criteria, and follow-up questions.
Affinity Mapping is a pattern recognition technique where observations, insights, and quotes are organized into thematic clusters. Sticky notes come in four types: Insight (yellow), Opportunity (green), Barrier (rose), and Quote (purple).
Part 2: How-To Guides
Step-by-step procedures for common tasks.
2.1 Creating a Project
1. From the Dashboard, click "New Project" in the top-right corner
2. Enter a project name and optional description
3. Select the initial status (Draft is recommended for new projects)
4. Choose the study mode: Moderated or Unmoderated
5. Set optional start and end dates for the research period
6. Configure recording options for camera and microphone
7. Add tags to enable trend analysis (type and press Enter, or click Add)
8. Click "Create" to save the project
2.2 Duplicating a Project
Duplicate an existing project to reuse its configuration for a new study cycle.
1. Find the project card on the Dashboard
2. Click the copy icon in the top-right of the card
3. The Create Project dialog opens with all settings pre-filled
4. "(Copy)" is automatically appended to the project name
5. Modify any settings as needed
6. Click "Create" to save the duplicated project
Note: Duplicating copies tasks, tags, messages, and recording settings. Participants are not copied.
2.3 Managing Project Tags
To add tags during project creation:
1. Type the tag name in the Tags field
2. Press Enter or click "Add"
3. Tags appear below the input field
4. Click a tag's X icon to remove it
Tag best practices:
- Use consistent naming conventions (e.g., "Q4-2025" not "q4 2025")
- Apply the same tags to projects you want to compare in Trend Analysis
- Consider tags for: time periods, product areas, user segments, research themes
2.4 Using Trend Analysis
1. Navigate to Trend Analysis from the sidebar
2. View available tags from your Active and Completed projects
3. Click a tag button to filter projects
4. View the NPS Trend Analysis chart and data table
5. Scroll down to view the SUS Trend Analysis
6. Click "Clear Filter" to reset and select a different tag
Prerequisites: Projects must have the same tag, be Active or Completed status, and have participants with NPS/SUS scores recorded.
2.5 Creating Hypotheses
1. Navigate to the Hypotheses tab in your project
2. Click "Add Hypothesis"
3. Select a research question or create one inline
4. Enter the hypothesis title and statement
5. Choose category and relevant user segments
6. Set status and priority
7. Add expected evidence and testing methodology
8. Click "Add Hypothesis" to save
2.6 Creating Tasks
The Task Editor uses a 4-step wizard:
Step 1 - Basic Info: Title (required), estimated time, difficulty level, linked hypotheses
Step 2 - Task Content: Objective, scenario, numbered steps
Step 3 - Criteria: Success criteria definition, optional rating scale
Step 4 - Questions: Custom follow-up questions from bank or custom
2.7 Working with Affinity Maps
To add clusters:
1. Click "Add Cluster" in the Synthesis tab
2. Select predefined clusters or create custom ones
3. Click "Add Selected"
To add and organize notes:
1. Click "Add Note" and fill in text, type, and cluster
2. Drag notes between clusters to reorganize
3. Delete empty clusters using the trash icon on hover
Part 3: Reference
Quick reference tables for fields, options, and configurations.
3.1 Project Fields Reference
| Field | Required | Description |
|---|---|---|
| Name | Yes | Descriptive project title |
| Description | No | Brief overview of research goals |
| Status | Yes | Draft, Active, Completed, or Archived |
| Mode | Yes | Moderated or Unmoderated |
| Start Date | No | When the study begins |
| End Date | No | When the study ends |
| Camera Option | No | Optional, Required, or Disabled |
| Microphone Option | No | Optional, Required, or Disabled |
| Tags | No | Labels for organization and trend analysis |
3.2 NPS Score Interpretation
| Score Range | Assessment | Meaning |
|---|---|---|
| 50 to 100 | Excellent | World-class customer loyalty |
| 30 to 49 | Good | Strong loyalty, room for improvement |
| 0 to 29 | Needs Improvement | Neutral to slightly positive |
| -100 to -1 | Critical | More detractors than promoters |
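The assessment bands in the table above map naturally to a small lookup helper. This is an illustrative sketch, not platform code:

```python
def nps_assessment(nps: float) -> str:
    """Return the assessment band for an NPS value, per the table above."""
    if not -100 <= nps <= 100:
        raise ValueError("NPS ranges from -100 to 100")
    if nps >= 50:
        return "Excellent"
    if nps >= 30:
        return "Good"
    if nps >= 0:
        return "Needs Improvement"
    return "Critical"
```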
3.3 SUS Score Interpretation
| Score Range | Grade | Interpretation |
|---|---|---|
| 80-100 | Excellent | Best imaginable usability |
| 68-79 | Good | Above average usability |
| 51-67 | OK | Below average usability |
| 0-50 | Poor | Significant usability problems |
3.4 Hypothesis Fields Reference
| Field | Required | Description |
|---|---|---|
| Research Question | Recommended | Links hypothesis to parent question |
| Hypothesis Title | Yes | Short name for reference |
| Statement | No | Full hypothesis with context |
| Category | Yes | Primary Barriers, Workflow, Usability, Organizational |
| Segments | No | Non-Users, Abandoned, Occasional, Active |
| Status | Yes | Testing, Validated, Disproven, Unclear |
| Priority | Yes | High, Medium, Low |
| Expected Evidence | No | What observations would support this |
| How to Test | No | Validation methods |
| Actual Evidence | Yes | Collected evidence points |
| Supporting Evidence | No | Additional context and quotes |
3.5 Task Fields Reference
| Field | Required | Description |
|---|---|---|
| Title | Yes | Descriptive task name |
| Estimated Time | No | 1-2 min to 20+ min options |
| Difficulty | No | Easy, Medium, Hard, or All Users |
| Objective | No | One-sentence goal |
| Scenario | No | Context paragraph (max 600 chars) |
| Steps | No | Numbered instructions |
| Success Criteria | No | Definition of completion |
| Rating Scale | No | Enable 1-5 difficulty rating |
| Questions | No | Custom follow-up questions |
| Linked Hypotheses | No | Hypotheses this task validates |
3.6 Validation Thresholds
Quantitative Thresholds:
- >70% of participants mention the issue → VALIDATED
- <30% of participants mention the issue → DISPROVEN
- 30-70% of participants mention the issue → UNCLEAR
Usability Benchmarks:
- Task success <50%: Major usability issue
- Time >5 min for basic tasks: Efficiency problem
- SEQ scores <4: Poor task experience
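The quantitative thresholds above amount to a simple classifier over the share of participants who mention an issue. A minimal sketch (the function name is illustrative, not the platform's API):

```python
def validation_status(mentions: int, participants: int) -> str:
    """Classify a hypothesis by the percentage of participants mentioning the issue."""
    if participants <= 0:
        raise ValueError("participants must be positive")
    pct = 100 * mentions / participants
    if pct > 70:
        return "Validated"
    if pct < 30:
        return "Disproven"
    return "Unclear"
```

Note that the boundary values fall into Unclear: with 10 participants, 7 mentions is exactly 70% and therefore not Validated.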
3.7 Predefined Clusters
Fifteen predefined clusters are available:
- Onboarding Barriers, Template Pain Points, Documentation Gaps
- What Works Well, Emerging Opportunities, User Frustrations
- Feature Requests, Workflow Issues, Learning Curve
- Integration Challenges, Performance Concerns, UI/UX Feedback
- Communication Gaps, Success Stories, Quick Wins
Part 4: How It Works
Technical explanations of system behavior.
4.1 Dashboard Organization
The Dashboard organizes projects into four sections based on status:
- Draft Projects: Displayed first, slightly dimmed (opacity 80%)
- Active Projects: Full opacity, primary focus
- Completed Projects: Collapsible section, open by default
- Archived Projects: Collapsible section, closed by default
Project cards are arranged in a three-column grid on larger screens, collapsing to two columns on tablets and a single column on mobile.
4.2 NPS Calculation
NPS is calculated from participant scores as follows:
1. Count participants with scores 9-10 → Promoters
2. Count participants with scores 7-8 → Passives
3. Count participants with scores 0-6 → Detractors
4. Calculate Promoter % = (Promoters / Total) × 100
5. Calculate Detractor % = (Detractors / Total) × 100
6. NPS = Promoter % − Detractor %
Note: Passives count toward the total but do not appear in the formula itself.
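The steps above can be sketched as a single function over the raw 0-10 scores. This is an illustrative helper, not the platform's implementation:

```python
def compute_nps(scores: list[int]) -> float:
    """NPS = %Promoters (9-10) minus %Detractors (0-6).

    Passives (7-8) enter the total only, so they dilute both percentages
    without appearing in the subtraction.
    """
    if not scores:
        raise ValueError("at least one score is required")
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * promoters / total - 100 * detractors / total
```

For example, 6 promoters, 2 passives, and 2 detractors out of 10 participants gives 60% − 20% = an NPS of 40.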
4.3 SUS Score Aggregation
For project-level and trend analysis, SUS scores are aggregated:
- Individual participant SUS scores are already calculated (0-100 scale)
- Project average = Sum of all participant SUS scores / Number of participants
- Participants are categorized by their individual scores for the pie chart
4.4 Trend Analysis Logic
Trend Analysis aggregates data across projects:
1. Filter projects by selected tag
2. Include only Active and Completed projects
3. For each project, calculate aggregate NPS and average SUS
4. Sort projects by start date (chronologically)
5. Display line chart with projects on X-axis, score on Y-axis
6. Calculate trend direction by comparing first and last project scores
Trend Direction:
- Improving: Last score > First score
- Declining: Last score < First score
- Stable: Last score = First score
4.5 Participant ID Assignment
Participants are automatically assigned anonymous IDs (P01, P02, etc.) based on the order they are added to a project. The underlying participant name is preserved separately for researcher reference.
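Based on the IDs shown in the text (P01, P02, ...), the scheme appears to be a "P" prefix with the 1-based enrollment position zero-padded to two digits. An illustrative sketch under that assumption:

```python
def participant_id(position: int) -> str:
    """Anonymous ID from 1-based enrollment order, e.g. 1 -> "P01".

    Assumes two-digit zero-padding as seen in the examples (P01, P02).
    """
    if position < 1:
        raise ValueError("enrollment position is 1-based")
    return f"P{position:02d}"
```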
4.6 User Segment Mapping
| Usage Level | Segment Name | Task Difficulty |
|---|---|---|
| active | Active | Hard |
| occasional | Occasional | Medium |
| (other) | Non-User | Easy |
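The table above maps directly to a dictionary lookup with a Non-User fallback for any unrecognized usage level. An illustrative sketch:

```python
def map_segment(usage_level: str) -> tuple[str, str]:
    """Return (segment name, task difficulty) for a usage level, per the table above."""
    mapping = {
        "active": ("Active", "Hard"),
        "occasional": ("Occasional", "Medium"),
    }
    # Any other usage level falls back to the Non-User segment.
    return mapping.get(usage_level, ("Non-User", "Easy"))
```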
4.7 Project Duplication
When duplicating a project, the following are copied:
- Name (with "(Copy)" appended), description, mode
- Tags, tasks, research goals
- Before/during/after messages
- Camera and microphone options
Not copied: Participants, dates (reset to blank), status (set to Active)
Part 5: Troubleshooting
Common issues and their solutions.
5.1 Dashboard Issues
Issue: Project not appearing in expected section
Cause: Project has a different status than expected.
Solution: Check the Completed and Archived sections (expand them if collapsed). Edit the project to change its status if needed.
Issue: Cannot see Completed or Archived projects
Cause: These sections are collapsible and may be closed.
Solution: Click the section header to expand. Completed is open by default; Archived is closed by default.
Issue: Project metrics showing 0 for all values
Cause: No participants, tasks, or insights have been added yet.
Solution: Click the project card to enter it and add participants, tasks, and observations.
5.2 Trend Analysis Issues
Issue: "No trend data available" message
Cause: No tags exist on any Active or Completed projects.
Solution: Add tags to your projects. Tags enable grouping for trend comparison.
Issue: Tag filter shows no data after selection
Cause: Projects with that tag have no NPS/SUS scores recorded.
Solution: Add NPS and SUS scores to participants in the Participants tab of projects with that tag.
Issue: Only some projects appear in trend chart
Cause: Projects must be Active or Completed status AND have at least one participant with scores.
Solution: Check that Draft/Archived projects are changed to Active or Completed. Verify participants have scores entered.
Issue: Chart not updating after adding new project data
Cause: The page may need to refresh to reflect new data.
Solution: Click a different tag then click your tag again, or refresh the browser page.
5.3 NPS & SUS Issues
Issue: NPS pie chart is empty
Cause: No participants have NPS scores recorded.
Solution: Enter NPS scores (0-10) for participants in the Participants tab.
Issue: SUS average showing unexpected value
Cause: SUS scores may have been entered incorrectly or include outliers.
Solution: Review individual participant SUS scores. Valid scores range 0-100. Check the raw SUS questionnaire calculations.
Issue: NPS calculation seems incorrect
Cause: Misunderstanding of NPS formula.
Solution: Remember that Passives (7-8) count toward the total but do not appear in the formula: NPS = %Promoters − %Detractors.
5.4 Hypothesis Issues
Issue: Cannot delete a research question
Cause: Hypotheses are still linked to it.
Solution: Move or delete all linked hypotheses first.
Issue: Import from Library shows no data
Cause: Global library is empty or failed to load.
Solution: Add items to global library first, or check network connectivity.
5.5 Task Issues
Issue: Task import validation errors
Cause: JSON doesn't match expected format.
Solution: Download the template and compare structure. Ensure title exists, difficulty is valid, question types are correct.
Issue: Segment mismatch warning
Cause: Task difficulty doesn't align with linked hypothesis segments.
Solution: This is a warning only. Review whether the task reaches appropriate participants, or adjust difficulty.
5.6 General Issues
Issue: Changes not persisting after refresh
Cause: API calls may have failed.
Solution: Check for error toast notifications. Open browser dev tools (F12) to check Console and Network tabs.
Issue: Sidebar covering content on mobile
Cause: Sidebar may be expanded.
Solution: Click the collapse button to minimize the sidebar, or navigate using the sidebar then close it.
Issue: Dialog closes unexpectedly
Cause: Clicking outside the dialog or pressing Escape closes it.
Solution: Be careful to click only within the dialog. Data is not saved until you click the save button.