
Part 1: Platform Overview

Dashboard & Projects

The Dashboard is the central hub for managing all research projects. Projects are organized by status into collapsible sections, making it easy to focus on active work while keeping historical data accessible.

Project Statuses

Status    | Badge Color | Description
Draft     | Gray        | Projects in planning phase, not yet actively collecting data
Active    | Green       | Currently running studies with ongoing data collection
Completed | Blue        | Finished studies with all data collected
Archived  | Yellow      | Historical projects preserved for reference

Project Modes

  • Moderated: A researcher guides participants through tasks in real-time, allowing for follow-up questions and observation

  • Unmoderated: Participants complete tasks independently, typically with screen recording for later review

Project Card Metrics

Each project card displays key metrics at a glance:

  • Participants: Number of participants enrolled in the study

  • Tasks: Number of tasks defined for the study

  • Insights: Count of sticky notes/observations captured

  • Sessions: Completed sessions out of total planned sessions

Recording Options

Projects can be configured with camera and microphone settings that control how participant sessions are recorded.

Setting  | Behavior
Optional | Participants can choose whether to enable the device
Required | Participants must enable the device to proceed
Disabled | The device will not be requested or used

The camera setting controls screen-recording capability, while the microphone setting controls audio capture for think-aloud protocols.

Tags

Tags are labels you apply to projects to categorize and group them. Tags serve two key purposes:

  • Organization: Group related projects (e.g., "Q4-2025", "Mobile App", "Onboarding")

  • Trend Analysis: Enable cross-project NPS and SUS comparison for all projects sharing a tag

Tip: Use consistent tag naming to enable meaningful trend analysis across related research cycles.

NPS (Net Promoter Score)

Net Promoter Score measures customer loyalty by asking: "How likely are you to recommend this product to a friend or colleague?" Responses are on a 0-10 scale.

Response Categories

  • Promoters (9-10): Enthusiastic users likely to recommend the product

  • Passives (7-8): Satisfied but not enthusiastic; vulnerable to competitors

  • Detractors (0-6): Unhappy users who may damage brand through negative word-of-mouth

NPS Calculation

NPS = % Promoters − % Detractors

The score ranges from -100 (all detractors) to +100 (all promoters).

SUS (System Usability Scale)

The System Usability Scale is a standardized 10-question survey that produces a score from 0-100, measuring perceived usability of a system.

Score Categories

Score Range | Grade     | Interpretation
80-100      | Excellent | Best imaginable usability
68-79       | Good      | Above-average usability
51-67       | OK        | Below-average usability
0-50        | Poor      | Significant usability problems

Note: The industry average SUS score is approximately 68. Scores above this indicate above-average usability.
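As a minimal sketch, the grade bands above translate directly into a lookup function. The name `sus_grade` is illustrative only and is not part of the platform:

```python
def sus_grade(score: float) -> str:
    """Map a SUS score (0-100) to the grade bands in the table above."""
    if not 0 <= score <= 100:
        raise ValueError("SUS scores range from 0 to 100")
    if score >= 80:
        return "Excellent"
    if score >= 68:
        return "Good"
    if score >= 51:
        return "OK"
    return "Poor"
```

Note that a score of exactly 68 (the industry average) falls in the "Good" band.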

Trend Analysis

Trend Analysis allows you to track NPS and SUS scores across multiple projects that share the same tag. This reveals how user satisfaction and usability evolve over time or across different product areas.

Key features

  • Filter by tag to compare related projects

  • Line charts show score progression across projects

  • Summary statistics including overall averages and trend direction

  • Breakdown tables with per-project details

  • Only Active and Completed projects are included in trend analysis

Research Questions & Hypotheses

Research Questions are the foundational inquiries that guide your study. They define what you're trying to learn and help organize hypotheses into logical groups.

Hypotheses are testable assumptions about user behavior, needs, or pain points. Each hypothesis is linked to a research question and tracked through validation states as you gather evidence.

Validation Statuses

  • Testing: Currently gathering evidence

  • Validated: Evidence strongly supports the hypothesis (more than 70% of participants)

  • Disproven: Evidence contradicts the hypothesis (fewer than 30% of participants)

  • Unclear: Mixed evidence (30-70% of participants)

Tasks & Affinity Mapping

Tasks are structured activities participants complete during usability testing. They include scenarios, step-by-step instructions, success criteria, and follow-up questions.

Affinity Mapping is a pattern recognition technique where observations, insights, and quotes are organized into thematic clusters. Sticky notes come in four types: Insight (yellow), Opportunity (green), Barrier (rose), and Quote (purple).

Part 2: How-To Guides

Step-by-step procedures for common tasks.

2.1 Creating a Project

  1. From the Dashboard, click "New Project" in the top-right corner

  2. Enter a project name and optional description

  3. Select the initial status (Draft is recommended for new projects)

  4. Choose the study mode: Moderated or Unmoderated

  5. Set optional start and end dates for the research period

  6. Configure recording options for camera and microphone

  7. Add tags to enable trend analysis (type and press Enter, or click Add)

  8. Click "Create" to save the project

2.2 Duplicating a Project

Duplicate an existing project to reuse its configuration for a new study cycle.

  1. Find the project card on the Dashboard

  2. Click the copy icon in the top-right of the card

  3. The Create Project dialog opens with all settings pre-filled

  4. The name is automatically appended with "(Copy)"

  5. Modify any settings as needed

  6. Click "Create" to save the duplicated project

Note: Duplicating copies tasks, tags, messages, and recording settings. Participants are not copied.

2.3 Managing Project Tags

To add tags during project creation:

  1. Type the tag name in the Tags field

  2. Press Enter or click "Add"

  3. Tags appear below the input field

  4. Click a tag's X icon to remove it

Tag best practices:

  • Use consistent naming conventions (e.g., "Q4-2025" not "q4 2025")

  • Apply the same tags to projects you want to compare in Trend Analysis

  • Consider tags for: time periods, product areas, user segments, research themes

2.4 Using Trend Analysis

  1. Navigate to Trend Analysis from the sidebar

  2. View available tags from your Active and Completed projects

  3. Click a tag button to filter projects

  4. View the NPS Trend Analysis chart and data table

  5. Scroll down to view the SUS Trend Analysis

  6. Click "Clear Filter" to reset and select a different tag

Prerequisites: Projects must have the same tag, be Active or Completed status, and have participants with NPS/SUS scores recorded.

2.5 Creating Hypotheses

  1. Navigate to the Hypotheses tab in your project

  2. Click "Add Hypothesis"

  3. Select a research question or create one inline

  4. Enter the hypothesis title and statement

  5. Choose category and relevant user segments

  6. Set status and priority

  7. Add expected evidence and testing methodology

  8. Click "Add Hypothesis" to save

2.6 Creating Tasks

The Task Editor uses a 4-step wizard:

Step 1 - Basic Info: Title (required), estimated time, difficulty level, linked hypotheses

Step 2 - Task Content: Objective, scenario, numbered steps

Step 3 - Criteria: Success criteria definition, optional rating scale

Step 4 - Questions: Custom follow-up questions from bank or custom

2.7 Working with Affinity Maps

To add clusters:

  1. Click "Add Cluster" in the Synthesis tab

  2. Select predefined clusters or create custom ones

  3. Click "Add Selected"

To add and organize notes:

  • Click "Add Note" and fill in text, type, and cluster

  • Drag notes between clusters to reorganize

  • Delete empty clusters using the trash icon on hover

Part 3: Reference

Quick reference tables for fields, options, and configurations.

3.1 Project Fields Reference

Field             | Required | Description
Name              | Yes      | Descriptive project title
Description       | No       | Brief overview of research goals
Status            | Yes      | Draft, Active, Completed, or Archived
Mode              | Yes      | Moderated or Unmoderated
Start Date        | No       | When the study begins
End Date          | No       | When the study ends
Camera Option     | No       | Optional, Required, or Disabled
Microphone Option | No       | Optional, Required, or Disabled
Tags              | No       | Labels for organization and trend analysis

3.2 NPS Score Interpretation

Score Range | Assessment        | Meaning
50 to 100   | Excellent         | World-class customer loyalty
30 to 49    | Good              | Strong loyalty, room for improvement
0 to 29     | Needs Improvement | Neutral to slightly positive
-100 to -1  | Critical          | More detractors than promoters
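The assessment bands can be sketched as a small classifier. The function name is illustrative, not part of the platform:

```python
def nps_assessment(nps: float) -> str:
    """Map an overall NPS (-100 to +100) to the assessment bands above."""
    if not -100 <= nps <= 100:
        raise ValueError("NPS ranges from -100 to +100")
    if nps >= 50:
        return "Excellent"
    if nps >= 30:
        return "Good"
    if nps >= 0:
        return "Needs Improvement"
    return "Critical"
```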

3.3 SUS Score Interpretation

Score Range | Grade     | Interpretation
80-100      | Excellent | Best imaginable usability
68-79       | Good      | Above-average usability
51-67       | OK        | Below-average usability
0-50        | Poor      | Significant usability problems

3.4 Hypothesis Fields Reference

Field               | Required    | Description
Research Question   | Recommended | Links hypothesis to parent question
Hypothesis Title    | Yes         | Short name for reference
Statement           | No          | Full hypothesis with context
Category            | Yes         | Primary Barriers, Workflow, Usability, Organizational
Segments            | No          | Non-Users, Abandoned, Occasional, Active
Status              | Yes         | Testing, Validated, Disproven, Unclear
Priority            | Yes         | High, Medium, Low
Expected Evidence   | No          | What observations would support this
How to Test         | No          | Validation methods
Actual Evidence     | Yes         | Collected evidence points
Supporting Evidence | No          | Additional context and quotes

3.5 Task Fields Reference

Field             | Required | Description
Title             | Yes      | Descriptive task name
Estimated Time    | No       | Options from 1-2 min to 20+ min
Difficulty        | No       | Easy, Medium, Hard, or All Users
Objective         | No       | One-sentence goal
Scenario          | No       | Context paragraph (max 600 characters)
Steps             | No       | Numbered instructions
Success Criteria  | No       | Definition of completion
Rating Scale      | No       | Enable a 1-5 difficulty rating
Questions         | No       | Custom follow-up questions
Linked Hypotheses | No       | Hypotheses this task validates

3.6 Validation Thresholds

Quantitative Thresholds:

  • >70% of participants mention the issue → VALIDATED

  • <30% of participants mention the issue → DISPROVEN

  • 30-70% of participants mention the issue → UNCLEAR
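The quantitative thresholds translate directly into a classification rule. A minimal sketch, where the function name and the 0.0-1.0 rate convention are assumptions rather than platform API:

```python
def validation_status(mention_rate: float) -> str:
    """Classify a hypothesis from the share of participants (0.0-1.0)
    who mention the issue, per the thresholds above."""
    if not 0.0 <= mention_rate <= 1.0:
        raise ValueError("mention_rate must be between 0.0 and 1.0")
    if mention_rate > 0.70:
        return "Validated"
    if mention_rate < 0.30:
        return "Disproven"
    return "Unclear"
```

For example, if 8 of 10 participants mention the issue (a rate of 0.8), the hypothesis is Validated.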

Usability Benchmarks:

  • Task success <50%: Major usability issue

  • Time >5 min for basic tasks: Efficiency problem

  • SEQ scores <4: Poor task experience

3.7 Predefined Clusters

15 predefined clusters are available:

  • Onboarding Barriers, Template Pain Points, Documentation Gaps

  • What Works Well, Emerging Opportunities, User Frustrations

  • Feature Requests, Workflow Issues, Learning Curve

  • Integration Challenges, Performance Concerns, UI/UX Feedback

  • Communication Gaps, Success Stories, Quick Wins

Part 4: How It Works

Technical explanations of system behavior.

4.1 Dashboard Organization

The Dashboard organizes projects into four sections based on status:

  • Draft Projects: Displayed first, slightly dimmed (opacity 80%)

  • Active Projects: Full opacity, primary focus

  • Completed Projects: Collapsible section, open by default

  • Archived Projects: Collapsible section, closed by default

Each project card displays a 3×3 grid on larger screens, responsive to 2 columns on tablets and single column on mobile.

4.2 NPS Calculation

NPS is calculated from participant scores as follows:

  1. Count participants with scores 9-10 → Promoters

  2. Count participants with scores 7-8 → Passives

  3. Count participants with scores 0-6 → Detractors

  4. Calculate Promoter % = (Promoters / Total) × 100

  5. Calculate Detractor % = (Detractors / Total) × 100

  6. NPS = Promoter % − Detractor %

Note: Passives count toward the total (so they lower both percentages) but do not appear directly in the NPS formula.
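The six steps above can be sketched in a few lines. The function name is illustrative, not the platform's actual code:

```python
def calculate_nps(scores: list[int]) -> float:
    """Compute NPS from raw 0-10 responses: %Promoters minus %Detractors."""
    if not scores:
        raise ValueError("At least one response is required")
    promoters = sum(1 for s in scores if s >= 9)    # 9-10
    detractors = sum(1 for s in scores if s <= 6)   # 0-6
    total = len(scores)  # passives (7-8) count here, but nowhere else
    return (promoters / total) * 100 - (detractors / total) * 100
```

For example, 3 promoters and 1 detractor out of 4 responses gives 75 − 25 = 50.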

4.3 SUS Score Aggregation

For project-level and trend analysis, SUS scores are aggregated:

  • Individual participant SUS scores are already calculated (0-100 scale)

  • Project average = Sum of all participant SUS scores / Number of participants

  • Participants are categorized by their individual scores for the pie chart
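The project-level aggregation is a plain arithmetic mean of the per-participant scores. A minimal sketch (function name assumed for illustration):

```python
def project_sus_average(participant_scores: list[float]) -> float:
    """Average already-computed participant SUS scores (each 0-100)."""
    if not participant_scores:
        raise ValueError("No SUS scores recorded")
    return sum(participant_scores) / len(participant_scores)
```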

4.4 Trend Analysis Logic

Trend Analysis aggregates data across projects:

  1. Filter projects by selected tag

  2. Include only Active and Completed projects

  3. For each project, calculate aggregate NPS and average SUS

  4. Sort projects by start date (chronologically)

  5. Display line chart with projects on X-axis, score on Y-axis

  6. Calculate trend direction by comparing first and last project scores

Trend Direction:

  • Improving: Last score > First score

  • Declining: Last score < First score

  • Stable: Last score = First score
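The trend-direction rule compares only the chronologically first and last scores. A sketch, assuming a single-project list is reported as Stable (the document does not specify this edge case):

```python
def trend_direction(scores_by_start_date: list[float]) -> str:
    """Compare the first and last project scores, sorted chronologically."""
    if len(scores_by_start_date) < 2:
        return "Stable"  # assumption: one project has no trend to report
    first, last = scores_by_start_date[0], scores_by_start_date[-1]
    if last > first:
        return "Improving"
    if last < first:
        return "Declining"
    return "Stable"
```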

4.5 Participant ID Assignment

Participants are automatically assigned anonymous IDs (P01, P02, etc.) based on the order they are added to a project. The underlying participant name is preserved separately for researcher reference.
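The ID format can be sketched as zero-padded formatting of the 1-based enrollment position. This assumes two-digit padding as shown (P01, P02); how the platform formats a hundredth participant is not documented:

```python
def participant_id(position: int) -> str:
    """Anonymous participant ID from 1-based enrollment order."""
    return f"P{position:02d}"
```

Usage: the first participant added becomes `participant_id(1)`, i.e. "P01".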

4.6 User Segment Mapping

Usage Level | Segment Name | Task Difficulty
active      | Active       | Hard
occasional  | Occasional   | Medium
(other)     | Non-User     | Easy
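The mapping above is a simple dictionary lookup with a catch-all default. A minimal sketch (function name assumed for illustration):

```python
def segment_for_usage(usage_level: str) -> tuple[str, str]:
    """Map a usage level to (segment name, task difficulty) per the table above."""
    mapping = {
        "active": ("Active", "Hard"),
        "occasional": ("Occasional", "Medium"),
    }
    # Any other usage level falls through to the Non-User segment.
    return mapping.get(usage_level, ("Non-User", "Easy"))
```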

4.7 Project Duplication

When duplicating a project, the following are copied:

  • Name (with "(Copy)" appended), description, mode

  • Tags, tasks, research goals

  • Before/during/after messages

  • Camera and microphone options

Not copied: Participants, dates (reset to blank), status (set to Active)

Part 5: Troubleshooting

Common issues and their solutions.

5.1 Dashboard Issues

Issue: Project not appearing in expected section

Cause: Project has a different status than expected.

Solution: Check the Completed and Archived sections (expand them if collapsed). Edit the project to change its status if needed.

Issue: Cannot see Completed or Archived projects

Cause: These sections are collapsible and may be closed.

Solution: Click the section header to expand. Completed is open by default; Archived is closed by default.

Issue: Project metrics showing 0 for all values

Cause: No participants, tasks, or insights have been added yet.

Solution: Click the project card to enter it and add participants, tasks, and observations.

5.2 Trend Analysis Issues

Issue: "No trend data available" message

Cause: No tags exist on any Active or Completed projects.

Solution: Add tags to your projects. Tags enable grouping for trend comparison.

Issue: Tag filter shows no data after selection

Cause: Projects with that tag have no NPS/SUS scores recorded.

Solution: Add NPS and SUS scores to participants in the Participants tab of projects with that tag.

Issue: Only some projects appear in trend chart

Cause: Projects must be Active or Completed status AND have at least one participant with scores.

Solution: Check that Draft/Archived projects are changed to Active or Completed. Verify participants have scores entered.

Issue: Chart not updating after adding new project data

Cause: The page may need to refresh to reflect new data.

Solution: Click a different tag then click your tag again, or refresh the browser page.

5.3 NPS & SUS Issues

Issue: NPS pie chart is empty

Cause: No participants have NPS scores recorded.

Solution: Enter NPS scores (0-10) for participants in the Participants tab.

Issue: SUS average showing unexpected value

Cause: SUS scores may have been entered incorrectly or include outliers.

Solution: Review individual participant SUS scores. Valid scores range 0-100. Check the raw SUS questionnaire calculations.

Issue: NPS calculation seems incorrect

Cause: Misunderstanding of NPS formula.

Solution: Remember that Passives (7-8) count in the denominator but don't appear in the formula. NPS = %Promoters - %Detractors only.

5.4 Hypothesis Issues

Issue: Cannot delete a research question

Cause: Hypotheses are still linked to it.

Solution: Move or delete all linked hypotheses first.

Issue: Import from Library shows no data

Cause: Global library is empty or failed to load.

Solution: Add items to global library first, or check network connectivity.

5.5 Task Issues

Issue: Task import validation errors

Cause: JSON doesn't match expected format.

Solution: Download the template and compare structure. Ensure a title is present, the difficulty value is valid, and the question types are correct.

Issue: Segment mismatch warning

Cause: Task difficulty doesn't align with linked hypothesis segments.

Solution: This is a warning only. Review whether the task reaches appropriate participants, or adjust difficulty.

5.6 General Issues

Issue: Changes not persisting after refresh

Cause: API calls may have failed.

Solution: Check for error toast notifications. Open browser dev tools (F12) to check Console and Network tabs.

Issue: Sidebar covering content on mobile

Cause: Sidebar may be expanded.

Solution: Click the collapse button to minimize the sidebar, or navigate using the sidebar then close it.

Issue: Dialog closes unexpectedly

Cause: Clicking outside the dialog or pressing Escape closes it.

Solution: Be careful to click only within the dialog. Data is not saved until you click the save button.