Writing good questions
Be specific
Tell respondents exactly what you want them to evaluate.

- Good: “Which product image would make you more likely to click on an Amazon listing?”
Provide context
Give respondents enough background to make an informed choice.

- Good: “We’re designing a logo for a children’s educational app. Which logo design feels more trustworthy and fun?”
Keep it focused
Ask one question at a time. If you need feedback on multiple aspects, use a multi-question survey (up to 8 questions) so the same respondents answer all of them, or create separate polls.

- Good: “Which headline would make you want to learn more about this product?”
Choosing the right question type
| Goal | Recommended type |
|---|---|
| Compare two options directly | Head-to-head |
| Rank multiple options by preference | Ranked |
| Get detailed feedback on one concept | Open-ended |
| See where people look on an image | Click test |
| Test first impressions | Five-second test |
| Measure overall satisfaction | Star rating |
| Pick one winner from several options | Single select |
| Find out which features or attributes appeal most | Multi-select |
Targeting your audience
Match your actual customers
Target demographics that reflect your real customer base. If you sell products on Amazon, target Amazon Prime members. If your audience skews younger, set appropriate age ranges.

Start broad, then narrow
If you’re unsure about targeting:
- Run a first poll with broad targeting to get general feedback
- Review the demographic breakdowns in your results
- Run follow-up polls with tighter targeting based on what you learned
Consider sample size
- 15 respondents - Good enough for quick directional feedback
- 30 respondents - Useful for early-stage validation with a bit more signal
- 50 respondents - Solid for most decisions
- 75 respondents - Good balance between confidence and speed
- 100+ respondents - Use when you need demographic breakdowns or high confidence
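To see why larger samples buy you more confidence, here is a quick sketch (using the standard normal-approximation formula, not anything specific to the polling product) of the worst-case 95% margin of error at each sample size above:

```python
# Approximate 95% margin of error for a poll proportion.
# p = 0.5 is the worst case (widest interval).
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Half-width of an approximate 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (15, 30, 50, 75, 100):
    print(f"n={n:3d}: ±{margin_of_error(n):.0%}")
# n= 15: ±25%
# n= 30: ±18%
# n= 50: ±14%
# n= 75: ±11%
# n=100: ±10%
```

Even at 100 respondents the interval is roughly ±10 points, which is why close splits call for more respondents rather than a firm conclusion.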
Preparing your options
Use consistent formatting
Keep all options at the same quality level and format. If one image is high-resolution and another is a rough sketch, respondents will judge quality rather than the actual concept.

Test real alternatives
Use options that represent genuine choices you’re deciding between, not obviously good vs. obviously bad options.

Limit the number of options
- 2 options - Use head-to-head for clear A/B decisions
- 3-5 options - Best for ranked or single-select polls
- 6-8 options - Use only when all options are genuinely viable
Interpreting results
Look beyond the winner
The written explanations often contain more valuable insight than the vote counts. Read them to understand why people chose what they chose.

Check demographic breakdowns
Different audience segments may have different preferences. A result that’s split 50/50 overall might be 80/20 within your target demographic.

Consider statistical significance
- Strong signal - 70%+ of respondents agree
- Moderate signal - 55-70% agree, consider adding more respondents to increase confidence
- Weak signal - Close to 50/50, the options may be equally viable
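The signal thresholds above can be sanity-checked with an exact binomial test (a textbook calculation, shown here as an illustration): it gives the probability of seeing a split at least that lopsided if respondents actually had no preference.

```python
# Exact one-sided binomial test: how surprising is k votes out of n
# if respondents truly had no preference (p = 0.5)?
from math import comb

def p_value_at_least(k: int, n: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 70% agreement among 50 respondents (35 votes):
print(p_value_at_least(35, 50))  # well under 0.05 -> strong signal
# 55% agreement among 50 respondents (28 votes):
print(p_value_at_least(28, 50))  # above 0.2 -> weak evidence on its own
```

Note that 55% agreement at n=50 is quite consistent with a coin flip, which is why the moderate-signal range benefits from adding more respondents.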
Run iterative tests
Use initial poll results to refine your options, then run another poll. Two rounds of 50-respondent polls often yield better insights than one round of 100.

Common mistakes to avoid
- Leading questions - “Don’t you think Option A looks more professional?” biases respondents
- Too many variables - Comparing options that differ in multiple ways makes it hard to know what drove the preference
- Mismatched formats - Mixing high-quality images with rough mockups skews results toward production quality rather than concept preference
- Ignoring written feedback - Quantitative votes tell you what won; written explanations tell you why
- Testing too late - Run polls early in the design process when changes are still easy to make
