Survey Best Practices
Practical advice for getting more out of LogRocket Surveys — what works, what doesn't, and how teams use them in practice.
Common use cases
Different teams use Surveys for different things. Here are the patterns we see work best.
Onboarding feedback — Survey new users during or right after onboarding to find out where they got stuck. Target users who recently signed up with a filter like "First session" or "Visited URL contains /onboarding." A rating question gives you a trackable score; a conditional open text follow-up for low ratings captures actionable detail. Pair the responses with session replays to see exactly where the friction happens — you'll often find issues that users can't articulate but that are obvious on video.
NPS and satisfaction tracking — Run a standing NPS or satisfaction survey targeted at active users. Chart your score over time, correlate changes with product releases, and measure whether improvements actually move the needle. Teams that run NPS continuously (rather than quarterly) catch regressions faster and can respond before they become trends.
Feature prioritization — When you're deciding what to build next, let your users weigh in. A ranking question with 4–6 candidate features shows you relative preferences, not just "what sounds cool." Or use a multiple choice question like "Which of these features would you find most valuable?" to get quick signal. Targeting this survey at your most engaged users (via session count or usage-based traits) ensures the input comes from people who actually use the product.
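If you track engagement in your app, one way to make those signals targetable is to pass them as user traits through the SDK's identify call (see the FAQ below on custom user properties). A minimal sketch in TypeScript; the trait names here, like projectsCreated, are examples rather than required fields:

```typescript
import LogRocket from 'logrocket';

// Initialize once, early in your app's lifecycle.
LogRocket.init('your-org/your-app');

// After login, attach the traits you want available as audience
// filters. Trait names are illustrative; pass whatever engagement
// signals your app already tracks.
LogRocket.identify('user-123', {
  name: 'Ada Lovelace',
  email: 'ada@example.com',
  projectsCreated: 12, // a usage-based engagement signal
});
```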
Churn and exit surveys — Target users on your cancellation or downgrade page with a survey asking why they're leaving. A single choice question with common reasons (too expensive, missing features I need, switching to competitor, too complicated to use) gives you structured data you can track over time. A conditional open text follow-up captures nuance. The session replay attached to each response shows you the user's last experience before they decided to leave — sometimes that reveals a usability issue that, if fixed, could have retained them.
Post-interaction feedback — Trigger a short survey after a specific action: completing a support interaction, finishing a checkout, publishing a project, or reaching a milestone. Keep it to 1–2 questions to capture the sentiment while the experience is fresh. A rating question is usually enough, with an optional open text field for users who want to elaborate.
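To mark the moment a specific action completes (and make those sessions easy to find later in replays), one option is to record a custom event from your app code with the SDK's track call. A sketch with an illustrative event name; check the filters available in your Audience tab to confirm how events can drive targeting in your setup:

```typescript
import LogRocket from 'logrocket';

// Call from your existing success handler. The event name is
// illustrative; use one that matches your own flow.
function onCheckoutComplete(): void {
  // Marks the milestone on the session timeline, so sessions that
  // reached this point are easy to filter and review later.
  LogRocket.track('CheckoutCompleted');
}
```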
Tips for better surveys
Keep it under 5 questions. This is the single biggest lever for completion rates. Every question you add costs you respondents — you can see it directly in your drop-off funnel. If you need to ask more than 5, split into two targeted surveys instead of one long one.
Start with a low-effort question. Lead with a rating scale or single choice question. Save open text questions for later in the flow — once a user has invested a few clicks, they're more likely to write a thoughtful response.
Use conditional logic to stay relevant. Don't ask everyone the same questions. If a user rates their experience 5 stars, don't follow up with "What went wrong?" — skip to "What do you like most?" or straight to the end screen. Conditional surveys feel personal and respectful of the user's time. See Creating Surveys > Conditional questions for setup details.
Write from the user's perspective. Instead of "Rate our onboarding flow" (internal jargon), try "How easy was it to get started?" Users respond better to language that reflects their experience, not your product's internal structure.
Be specific with choice options. Don't rely on "Other" as a primary catch-all. Include the 3–5 most likely answers based on what you already know from support tickets, session replays, or previous surveys. Add "Other" with a free-text input as a supplement to catch what you missed.
Test before you launch. Always use the Preview modal to step through every question, including all conditional branches. It's easy to create a logic path that accidentally skips a question or dead-ends the survey. See Creating Surveys.
Target thoughtfully. A survey shown at the wrong moment gets dismissed — and dismissed surveys don't come back. Target users who have had enough time to form an opinion: people who've reached your dashboard, not your landing page. Don't show surveys on the very first page load. Give users a reason to have something to say first.
Watch the replays. We can't stress this enough. The most common mistake we see is teams collecting hundreds of responses and never clicking into a single replay. The survey tells you what; the replay tells you why. Make it a weekly habit: pick 5–10 of your lowest-rated responses and watch what happened. You'll learn more in 20 minutes than from a week of staring at charts.
Walkthrough: Building an onboarding feedback survey
Here's how we'd actually build an onboarding survey from scratch — going deeper than the Surveys Quick Start to show how the pieces fit together.
The goal: Figure out where new users struggle during onboarding so you know what to fix first.
Plan your questions
Before opening the builder, sketch out what you want to learn:
- How easy was the overall experience? (quantitative — trackable over time)
- Where specifically did they struggle? (structured — actionable categories)
- Anything else? (qualitative — catches surprises)
Build the survey
You could start with the Onboarding experience template, which has a similar structure, or build from scratch:
- Welcome screen — Title: "Quick question about your setup." Description: "Help us improve onboarding." Button: "Sure."
- Rating — Title: "How easy was it to get started?" Scale: 1–5 stars.
- Single choice (conditional: show only if rating is less than 4) — Title: "What was the hardest part?" Options: "Installing the SDK," "Understanding the dashboard," "Setting up my first project," "Something else." This only appears for users who had a rough time, keeping the survey short for satisfied users.
- Open text — Title: "Any other thoughts?" This appears for everyone and catches feedback that doesn't fit the structured options.
- End screen — Title: "Thanks!" Body: "Your feedback helps us make onboarding better for everyone."
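Laid out as plain logic, the flow from the rating question onward looks like this. It's just an illustration of what you configure in the builder, not LogRocket code:

```typescript
// A plain-TypeScript model of the walkthrough's question flow;
// it mirrors the builder configuration and is not an API.
function questionsAfterRating(rating: number): string[] {
  const flow: string[] = [];
  if (rating < 4) {
    // The conditional step: only users who struggled see it.
    flow.push('What was the hardest part?');
  }
  flow.push('Any other thoughts?'); // shown to everyone
  return flow;
}
```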
Configure design and audience
- In the Design tab, set your brand's primary color and choose "Steps" as the progress bar style so users can see how many questions are left.
- In the Audience tab, add a "First session" filter so only new users see the survey. Set the targeting percentage to 50% to start — you can increase it once you've verified the survey works well.
Activate and monitor
Save, preview (always preview), then activate from the Surveys list. Check the dashboard after a day or two. You're looking for a completion rate above 70% — that's a good benchmark for a short survey. Below that? Check the drop-off funnel to see which question is losing people.
Act on the data
Once you have 50+ responses, here's the workflow:
- Check the rating trend in the Summary section. Is the average score stable, improving, or declining?
- Review the choice breakdown to see which onboarding step causes the most friction. If "Installing the SDK" dominates, that's a clear signal to invest in better install docs or a guided setup wizard.
- Search individual responses for keywords like "confusing," "broken," or "error." These surface specific issues that the structured questions might miss.
- Watch session replays for your lowest-rated responses. Two minutes of video often reveals more than a hundred text responses. Look for patterns: are multiple low-rating users hitting the same error? Getting lost on the same page?
- Create a saved search using the Survey Response filter for "rating less than 3" so you can revisit low-satisfaction sessions regularly without re-filtering each time.
- Track improvement by shipping a fix and watching the average rating in your timeseries chart. If the fix worked, you'll see the score climb in the following weeks.
Frequently asked questions
Can I edit a survey while it's active? Yes. You can make changes without deactivating the survey first; see the next two questions for how edits affect existing responses.
What happens to responses if I change a question? Changing a question's type will cause previous responses for that question to no longer appear in results. If you need to change how a question works, consider disabling the old question and adding a new one instead so historical data is preserved.
Can I reorder questions after the survey has started collecting responses? Yes, and responses will not be lost.
How often can a user see the same survey? Once a user completes or closes a survey, they will not see it again — even in future sessions.
Can I run multiple surveys at the same time? Yes. You can have multiple active surveys simultaneously. If a user qualifies for more than one survey, the surveys are staggered rather than shown at once.
What are the limits on the Free plan? On the Free plan, you can have up to 3 active surveys and collect up to 100 responses per survey. All question types and templates are available. The "Powered by LogRocket" watermark appears on free-plan surveys — upgrade to Pro to remove it.
Is there a maximum number of surveys I can create? No. There's no cap on the total number of surveys you can create; the Free plan only limits how many can be active at once (see the previous question).
Can I use surveys on mobile apps? Coming soon!
How do I remove the "Powered by LogRocket" watermark? Upgrade to a Pro plan. The watermark is automatically removed for paid accounts. See Creating Surveys for details.
Where do survey responses appear in Feedback? Survey responses automatically flow into Feedback, where they're combined with other qualitative data sources like support tickets and app store reviews.
Can I target surveys based on custom user properties? Yes. Any user traits you set via the .identify() SDK call are available as filters under "Audience Targeting."
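For example, a minimal sketch (plan and role are illustrative trait names; any custom properties you pass behave the same way):

```typescript
import LogRocket from 'logrocket';

// Custom traits passed to identify() become targetable
// user properties in your audience filters.
LogRocket.identify('user-123', {
  plan: 'enterprise',
  role: 'admin',
});
```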
What happens if I deactivate a survey — do I lose the data? No. Deactivating a survey stops it from appearing to new users, but all existing response data, analytics, and session replay links are preserved. You can reactivate the survey at any time.
Can I duplicate a survey and modify the copy? Yes. Use the Duplicate action from the survey list to create an exact copy with all questions, logic, design, and targeting. The copy starts in an Inactive state with zero responses, so you can modify it without affecting the original. See Creating Surveys > Duplicating a survey.
Does the survey widget conflict with other on-page widgets (chat, help, etc.)? If your app uses another on-page widget, such as a help desk or support chat, place the survey widget in a different corner so the two don't overlap.
What happens if a user starts a survey but doesn't finish? Any answers they submitted before abandoning the survey are saved.
