Questionnaire design - How to ensure your reports deliver what you need
Part Four of Five in the series: Improving Survey Effectiveness
Written by: Paul Quinn, Managing Director of Quinntessential Marketing. ©
Ever heard the saying
'Start with the end in mind'? Well, it's true in
many areas of life, and survey design is no exception.
Too often we see surveys where scant consideration has
been given to the purpose of the questionnaire: people
simply go off and create surveys with little or no thought
to what questions they are trying to answer and consequently
the format and content of the final reports don't deliver.
And of course once the survey is closed off and responses
collected there's nothing you can do about poorly worded
questions or answer options that don't give you the
actionable data you need to make solid, well-informed decisions.
In this article we'll share
some ideas to help ensure you construct your survey
questionnaires to give you the data you need.
1. Clearly define your
survey objectives before writing your questions:
Sounds like common sense,
but unfortunately many people skip this stage and end
up with unfocused surveys that produce data of limited
use. Here are some key questions to ask yourself at
the outset of your survey project:
- What do you already
know, and what exactly are you trying to find out?
- What decisions do you
need to make as a result of the survey?
- What are the key metrics
you need to gauge and measure?
- How do you envisage
your completed reports will look? What will they contain?
- How will you know if
the survey project has been a success?
2. Consider the exact analysis you will need to conduct:
Once you've completed your draft
questionnaire, take a moment to imagine what you will see
once people have completed the survey. If 32% of your staff
scored the organisation 6 out of 10 for your question regarding
'Training and Development Opportunities', what decisions will
you be able to make as a result of this information? If that's
not obvious, consider how you could re-word the question or
change the rating scale to give you something more useful.
It can sometimes also be worth
launching a duplicate version of your survey before the final
version is sent out. Complete a dummy response (or 3) in this
test survey and then view your reports to allow you to see
exactly what your final results are going to look like. It's far
easier to change your survey content to make your reports
more meaningful during the 'testing' phase than attempting
to make changes after the 'real' survey is live.
Consider too who will be asking
you for information and what types of information will they
want. For example, if you are about to run a customer survey
and can foresee in advance that your State Managers will ask
for State-specific reporting data, then you need to ensure
your survey is set up to easily allow this analysis. You could
either pre-load location data into your invite file for upload
(PeoplePulse calls this our 'Preloaded Filter' feature) or include these questions
in the body of the questionnaire. The same goes when you want
to dissect your results by gender, department, age, name, etc.
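For the technically inclined, here's a minimal sketch (in Python, with made-up response data) of why pre-loading respondent attributes pays off: when each response already carries its State, State-specific reporting becomes a simple group-and-average.

```python
# Illustrative only: a handful of hypothetical survey responses,
# each tagged with a pre-loaded 'state' attribute.
from collections import defaultdict

responses = [
    {"state": "NSW", "satisfaction": 8},
    {"state": "NSW", "satisfaction": 6},
    {"state": "VIC", "satisfaction": 9},
    {"state": "QLD", "satisfaction": 5},
]

# Group satisfaction scores by state.
by_state = defaultdict(list)
for r in responses:
    by_state[r["state"]].append(r["satisfaction"])

# Average satisfaction per state - ready for a State Manager's report.
state_averages = {s: sum(v) / len(v) for s, v in by_state.items()}
print(state_averages)  # {'NSW': 7.0, 'VIC': 9.0, 'QLD': 5.0}
```

The same pattern applies to any pre-loaded attribute (department, gender, customer segment): the slicing is trivial if the data is attached up front, and impossible if it was never collected.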
3. Consider your rating scales:
You sometimes hear of clients
who run surveys with 10 point satisfaction scales, and then
when it comes to analysing their results question how to best
interpret and act on the data received. You can hear the puzzled
executive's thoughts now: 'What really is the difference between
a customer scoring us 7 out of 10 as opposed to an 8 out of
10? How can I act on these results?'
In such cases, it often pays
to not only consider a question re-word, but also consider
the answer scales you are using. For example, instead of a
10 point 'Satisfaction' scale, what about a 3 point 'Improvement' scale?
- 3 = Happy - No Improvement
- 2 = Indifferent - Some Improvement
- 1 = Unhappy - Considerable Improvement
The benefit of using a scale
such as the one above is the actionable data it delivers.
One could argue that reporting back to the Executive team
that 60% of your customers rated you as needing to make 'Considerable
Improvement' in a specific area of your business is far more
powerful than telling them that 60% of your customers rated
you as a 6 or less out of 10.
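To make the contrast concrete, here's a small sketch (in Python, with hypothetical scores) of how the two headline figures are computed: on the 3-point Improvement scale the key number falls straight out of a count, while the 10-point scale forces you to pick an arbitrary cut-off first.

```python
# Hypothetical results from the same ten customers on each scale.
ten_point_scores = [6, 4, 9, 5, 3, 8, 6, 10, 2, 5]   # 10-point satisfaction
improvement_scores = [1, 1, 3, 1, 1, 3, 2, 3, 1, 1]  # 1 = Considerable Improvement

# 10-point scale: you must choose a cut-off (here <= 6) and then explain it.
pct_six_or_less = 100 * sum(s <= 6 for s in ten_point_scores) / len(ten_point_scores)

# Improvement scale: the actionable figure is a direct count of one answer option.
pct_considerable = 100 * improvement_scores.count(1) / len(improvement_scores)

print(pct_six_or_less)   # 70.0 -> "70% scored us 6 or less" (needs interpreting)
print(pct_considerable)  # 60.0 -> "60% say Considerable Improvement is needed"
```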
It's also important to consider
the answer scales you use when you want to perform calculations
on the results to get a mean, median, etc. For example, if
you want to report on the exact number of employees in each
Division of a business, the following type of answer list
may not give you what you need:
a. Less than 10
e. More than 100
As written, these answer options
don't work if you want to be able to report on the average
number of staff as the basis for a calculation (eg, if someone
selects "More than 100" you don't know the exact
number of staff they have). In this case, you need to ask
for a number in an open-ended question:
Please indicate the current number
of full time employees in your Division: ____
PeoplePulse can also force respondents
to enter a numeric response instead of text (eg. '12' instead
of 'Twelve') to help standardise responses and make data analysis easier.
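The point above can be sketched in a few lines of Python (the validation helper and figures are hypothetical): banded options like 'More than 100' cannot be averaged, whereas validated numeric answers can.

```python
def parse_headcount(answer: str) -> int:
    """Accept only a whole number (e.g. '12', not 'Twelve' or 'More than 100')."""
    cleaned = answer.strip()
    if not cleaned.isdigit():
        raise ValueError(f"Numeric response required, got: {answer!r}")
    return int(cleaned)

# Three hypothetical open-ended responses - exact figures, so a mean is possible.
answers = ["12", "45", "103"]
headcounts = [parse_headcount(a) for a in answers]
average = sum(headcounts) / len(headcounts)
print(round(average, 1))  # 53.3

# A banded answer would be rejected rather than silently skew the calculation:
# parse_headcount("More than 100")  -> raises ValueError
```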
4. Make every question fight to earn its place in your survey:
Just because you can think of
20 different questions relating to your organisation's training
and development program doesn't mean you should ask them all.
Focus with laser-like precision on the 2 or 3 questions
you really need answered and that will deliver good, actionable
data upon which to make decisions. It's actually easier to
include 20 loosely related questions in one section than to
refine those down to the 2-3 'money' questions that will
make a real difference - but it's worth making the effort.
Shorter, more focused surveys will reinforce that you value
the time of your respondents and increase the likelihood of
future participation in your research projects.
By defining clear survey objectives
and then giving appropriate attention to your reporting requirements
during the initial questionnaire design stage you significantly
improve your chances of conducting a useful, effective survey
that provides actionable data. A survey that, from a respondent
perspective, is so well thought out that even you would be
willing to take it!
PeoplePulse - Australian-built online staff survey tool:
PeoplePulse is an Australian-built online feedback and survey tool
used extensively by Australian and New Zealand based
organisations to conduct online employee and customer
surveys. The tool can also be used by companies to conduct
cost effective staff climate surveys, training needs
analysis surveys, exit interviews, 'new starter' feedback,
'stay' surveys and a wide range of customer-related surveys.
Thank you for reading this article: Questionnaire design - How to ensure your reports deliver what you need.