Follow these guidelines to get the most actionable results from your PickFu surveys.

Writing good questions

Be specific

Tell respondents exactly what you want them to evaluate.
“Which product image would make you more likely to click on an Amazon listing?”

Provide context

Give respondents enough background to make an informed choice.
“We’re designing a logo for a children’s educational app. Which logo design feels more trustworthy and fun?”

Keep it focused

Ask one question at a time. If you need feedback on multiple aspects, use a multi-question survey (up to 8 questions) so the same respondents answer all of them, or create separate polls.
“Which headline would make you want to learn more about this product?”

Choosing the right question type

  • Compare two options directly - Head-to-head
  • Rank multiple options by preference - Ranked
  • Get detailed feedback on one concept - Open-ended
  • See where people look on an image - Click test
  • Test first impressions - Five-second test
  • Measure overall satisfaction - Star rating
  • Pick one winner from several options - Single select
  • Find out which features or attributes appeal most - Multi-select

Targeting your audience

Match your actual customers

Target demographics that reflect your real customer base. If you sell products on Amazon, target Amazon Prime members. If your audience skews younger, set appropriate age ranges.

Start broad, then narrow

If you’re unsure about targeting:
  1. Run a first poll with broad targeting to get general feedback
  2. Review the demographic breakdowns in your results
  3. Run follow-up polls with tighter targeting based on what you learned

Consider sample size

  • 15 respondents - Good enough for quick directional feedback
  • 30 respondents - Useful for early-stage validation with a bit more signal
  • 50 respondents - Solid for most decisions
  • 75 respondents - Good balance between confidence and speed
  • 100+ respondents - Use when you need demographic breakdowns or high confidence
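The intuition behind these tiers is that a poll's worst-case margin of error shrinks with the square root of the respondent count. A minimal sketch using the standard normal approximation for a two-option poll (the formula is a general statistics rule of thumb, not a PickFu-specific calculation):

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case margin of error (p = 0.5) for a two-option poll at
    ~95% confidence, using the normal approximation to the binomial."""
    return z * math.sqrt(0.25 / n)

for n in (15, 30, 50, 75, 100):
    print(f"{n:>3} respondents: +/- {margin_of_error(n):.0%}")
```

Running this shows why 15 respondents is directional only (the margin is roughly a quarter of the vote share) while 100 respondents narrows it to about ten percentage points.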

Preparing your options

Use consistent formatting

Keep all options at the same quality level and format. If one image is high-resolution and another is a rough sketch, respondents will judge quality rather than the actual concept.

Test real alternatives

Use options that represent genuine choices you’re deciding between, not obviously good vs. obviously bad options.

Limit the number of options

  • 2 options - Use head-to-head for clear A/B decisions
  • 3-5 options - Best for ranked or single-select polls
  • 6-8 options - Use only when all options are genuinely viable

Interpreting results

Look beyond the winner

The written explanations often contain more valuable insight than the vote counts. Read them to understand why people chose what they chose.

Check demographic breakdowns

Different audience segments may have different preferences. A result that’s split 50/50 overall might be 80/20 within your target demographic.

Consider statistical significance

  • Strong signal - 70%+ of respondents agree
  • Moderate signal - 55-70% agree, consider adding more respondents to increase confidence
  • Weak signal - Close to 50/50, the options may be equally viable
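If you want a more formal check than these rules of thumb, you can ask how likely a split this lopsided would be if respondents were actually choosing at random. A minimal sketch using an exact two-sided binomial test (a standard statistical technique; the function name and vote counts below are illustrative, not part of PickFu):

```python
from math import comb

def split_pvalue(votes_a, votes_b):
    """Probability of seeing a split at least this lopsided if every
    respondent were flipping a fair coin (exact two-sided binomial test)."""
    n = votes_a + votes_b
    k = max(votes_a, votes_b)
    # P(X >= k) under Binomial(n, 0.5), doubled for a two-sided test
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)
```

For example, a 35-15 split among 50 respondents gives a p-value well under 0.05 (a strong signal), while 28-22 does not, which matches the thresholds above: the closer the split is to 50/50, the more likely it is noise.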

Run iterative tests

Use initial poll results to refine your options, then run another poll. Two rounds of 50-respondent polls often yield better insights than one round of 100.

Common mistakes to avoid

These patterns reduce the quality of your results.
  • Leading questions - “Don’t you think Option A looks more professional?” biases respondents
  • Too many variables - Comparing options that differ in multiple ways makes it hard to know what drove the preference
  • Mismatched formats - Mixing high-quality images with rough mockups skews results toward production quality rather than concept preference
  • Ignoring written feedback - Quantitative votes tell you what won; written explanations tell you why
  • Testing too late - Run polls early in the design process when changes are still easy to make