[toc-this]
Please see also pages on [link title="deciding%20to%20use%20a%20survey" link="%2Faware%2FPA%2FPages%2FPlanning%2FDeciding-using-survey.aspx" /]
and [link title="analysing%20survey%20data" link="%2Faware%2FPA%2FPages%2FExamination%2FAnalysis-survey.aspx" /]
.
Principles
We plan and set up our surveys with the objective of obtaining relevant and reliable results at a proportionate cost. We manage the implementation of surveys proactively and address any problems as they arise.
We prepare the survey in a way that:
- the questions are relevant for the purpose of the survey;
- the questions are clear and easily understood;
- the questions have a good chance of being replied to reliably; and
- the responses can be analysed to provide a meaningful result.
We bear in mind that a short, simple survey is more likely to be responded to than a long and complicated one.
We test the questions before circulating the survey to help ensure it will be answered as intended, and mitigate any problems that might arise.
Instructions
Preparing the questionnaire
Devising a suitable questionnaire is an iterative process which may take several rounds of review (depending on its complexity and length), and should involve the audit team, audit management and, if applicable, policy experts. You can also seek advice from the ECA’s survey experts.
Think of the respondent when structuring the questionnaire, in order to facilitate the answering process. Group the questions logically in thematic sections. Introduce each section with a heading and short description, setting out the context.
Begin the questionnaire with simple questions that respondents will find interesting and engaging. As far as possible ensure the opening questions are applicable to all respondents. Present the most important questions at the beginning and middle of the questionnaire, as those at the end may be responded to less reliably. Use active ‘yes/no’ filters to present only those questions that are relevant to the respondent.
Keep your questionnaire as short as possible, ideally requiring no more than 20 minutes to complete. It can be helpful to indicate the expected completion time at the beginning of the survey. For longer questionnaires ensure that the respondent has the possibility to complete the survey in stages, should they be interrupted.
Consider the sensitivity of the topic and whether the survey responses should be anonymous or can be nominative. Remember that this may have an influence on the way the questions are responded to, and any [link title="follow up" link="#link-to-section-in-this-page" /] given to the survey.
Elaborate your topics of interest
Topics of interest differ in their complexity. For simple topics (e.g. gender of training participants) a single question can suffice. In contrast, do not address a complex topic of interest (e.g. administrative burden, innovation, deadweight, effectiveness of a system) through a single question only. Instead, break the topic down into component parts and use a series of questions. Consider covering different aspects of the topic through complementary questions.
[toggles]
[toggle title="Example%20of%20addressing%20a%20complex%20topic"]
The audit team envisaged using a survey to collect data on whether a programme targeted businesses with high potential for innovation. Innovation is a concept that can be understood differently in different contexts, and is partially a relative rather than absolute phenomenon. Asking directly the respondents to assess their degree of innovation bears the risk of overconfidence bias (e.g. a project that a person considers innovative for them might be standard practice for others). Therefore, the team identified the indicator variables that, taken together, provide a more reliable measurement of innovation:
- uniqueness of the company’s product(s), limited number of competitors; and
- innovation activity within the company (e.g. patents held, research and innovation budget and personnel, new products launched in recent years).
The team then proceeded to identify for each component suitable question(s). The assessment of the degree of innovativeness of the company was then made by the team taking into consideration the answers to the questions.
[/toggle]
[/toggles]
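The approach above can be sketched in code: the answers to the component questions are turned into indicator flags and combined into a simple composite score. All field names and thresholds below are illustrative assumptions, not part of this guidance, and the audit team still interprets the result.

```python
# Hypothetical sketch: deriving an innovation assessment from answers to
# the component questions. Field names and thresholds are illustrative
# assumptions only.

def innovation_indicators(response: dict) -> dict:
    """Turn one survey response into yes/no indicator flags."""
    return {
        "unique_product": response["competitors"] <= 2,   # few competitors
        "holds_patents": response["patents"] > 0,
        "has_rnd_budget": response["rnd_budget_eur"] > 0,
        "recent_launches": response["new_products_3y"] > 0,
    }

def innovation_score(response: dict) -> float:
    """Share of indicators met (0.0 to 1.0)."""
    flags = innovation_indicators(response)
    return sum(flags.values()) / len(flags)

respondent = {"competitors": 1, "patents": 2,
              "rnd_budget_eur": 50_000, "new_products_3y": 1}
print(innovation_score(respondent))  # 1.0 - all four indicators met
```

Taken together, such complementary indicators give a more reliable picture than asking respondents to rate their own innovativeness directly.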
Drafting the questions
Develop clear and understandable questions, which are relevant to the topic of the survey and can be replied to in a meaningful way. Each question should have a clear objective and [link title="purpose" link="#First-page/Purpose" /] in relation to the results being sought.
Use simple and understandable words
Use simple words wherever possible. In certain circumstances it may be unavoidable to use specialist terminology or foreign words, but explain these if the survey is addressed to non-experts. Use words and concepts consistently, and consider explaining those that may not be readily understood.
Be brief and precise
Avoid long, complex questions, as they may not be read fully or may be misunderstood, meaning that the responses may not be comparable.
Avoid double negatives
Questions that involve a double negative can be difficult to understand, which can lead to unreliable answers.
Do not demand too much from respondents
Only ask questions respondents are likely to be able to respond to reliably, or request information that they can obtain easily. If respondents are required to provide an answer that involves a calculation (such as percentages or average values), try to make it as easy as possible by giving examples or even a calculation tool. When figures and calculations are involved ask the respondents to confirm the source of the figures they provide when there are different possibilities, such as audited financial statements, management accounts, estimates etc.
One issue per question
If you are addressing a complex matter, then consider separating the subject into short questions that cover one issue each.
[toggles]
[toggle title="Example"]
Ask: “Did you use external consultancy services for the EU-subsidised investment? Yes/No.” If yes: “Were there costs you paid yourself that were not reimbursed? Yes/No.” “What was the amount of the costs (excluding VAT) you paid yourself? Please specify the currency.”
Not: “Have you paid for external consultancy services for the EU-subsidised investment, and what was the amount of costs not reimbursed (in euro, without VAT)? If other currency, please specify.”
[/toggle]
[/toggles]
Word questions in a neutral way
Do not phrase questions with a predetermined response in mind. Whenever possible, use scales for answer options, instead of a binary agree/disagree, since the latter tends to elicit higher rates of agreement.
Avoid asking about hypothetical situations
Avoid asking questions about hypothetical situations, particularly when they are far removed from the experience and situation of the respondent, as the resulting answers may be insufficiently robust to be used as [link title="evidence" link="%2Faware%2FGAP%2FPages%2FAudit-evidence.aspx" /]
. If you intend to ask about a hypothetical situation to which the respondent should be able to provide a meaningful response, consider also asking complementary questions that could reveal the real understanding of the respondent of the hypothetical action or view.
[toggles]
[toggle title="Example"]
Ask concrete questions such as:
“Have you considered an expansion abroad?”
“Did you engage with the support services?” and “If yes, how would you rate their services?”
Do not ask hypothetical questions such as:
“If you wanted to expand your business abroad, would you use the services of the EU SME centres?”, “If yes, how do you rate the support services?“, “If no, what were the reasons?”
[/toggle]
[/toggles]
Be precise when referring to time periods
To help ensure accurate and comparable results, where appropriate questions should relate to a clearly defined period or point in time.
Have exhaustive non-overlapping well-balanced response categories
Use predefined response categories to: make it easier for respondents to reply; improve the reliability of the replies by showing the different responses possible; and facilitate compiling and classifying the results. The challenge is to define appropriate response categories that are relevant to what is being asked, cover as many potential answers as possible and make them mutually exclusive to prevent ambiguity. A “residuals” category such as ‘other – please specify’, ‘none of these’, ‘do not know’ should be included if there could be other options, to help reduce inaccurate replies.
If you ask respondents to rate an issue, use a five-point scale (or a seven-point scale) with fully verbalised naming or numbering and a middle category present. Do not use negative numerical labels. As far as possible use the same rating scale throughout the questionnaire to avoid confusing the respondent and to facilitate the subsequent analysis and presentation of the results.
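As an illustration, a fully verbalised five-point scale with a middle category can be defined once and reused for all rating questions; the labels below are an example agree/disagree scale, not prescribed wording.

```python
# A single, fully verbalised five-point scale with a middle category,
# reused across all rating questions. Labels are illustrative.
LIKERT_5 = {
    1: "Strongly disagree",
    2: "Disagree",
    3: "Neither agree nor disagree",  # middle category
    4: "Agree",
    5: "Strongly agree",
}

def summarise(ratings: list) -> dict:
    """Frequency of each verbal label, ready for presenting the results."""
    return {label: ratings.count(code) for code, label in LIKERT_5.items()}

print(summarise([5, 4, 4, 3, 2, 4]))
```

Reusing one scale definition keeps the wording consistent across the questionnaire and simplifies the later tabulation of results.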
Include explanations when needed
Some questions may require brief explanations to ensure the respondent fully understands them.
Choose the right measurement level
Some questions involve a graded or measured reply, for example the extent to which the respondent agrees with a given assertion. The type of question, and possible responses, will determine the level of grading that is appropriate, and therefore the precision of measurement that is possible. The granularity involved will determine the analysis you can conduct with the responses. A more granular level – provided this doesn’t introduce false precision – will allow more statistical procedures to be applied. More granular results can be presented with less precision, but not the other way around. Also, questions requiring more precision in the replies may discourage the respondent (e.g. respondents will be more inclined to indicate their income or age in the form of ranges, rather than providing the precise amount or date of birth).
There are four levels of measurement: nominal, ordinal, interval, and ratio. Where applicable, choose the level of measurement that best reflects how you want to use the responses (see examples).
[toggles]
[toggle title="Example"]
In a nominal level variable, values have no order or hierarchy.
Example: “What are the supervisory tasks of the financial supervisor?
- Investor protection activities (insurance, enforcement rules)
- Macro-prudential supervision (monitoring systemic risk)
- Micro-prudential supervision (surveillance of safety and soundness of institutions)
- Recovery and resolution activities
- Other specified tasks”
The statement resulting from this question would be the distribution of frequencies of the tasks of the financial supervisor.
Ordinal level variables have a meaningful order or hierarchy.
Example: “How do you rate your skills in Excel?
- Advanced
- Intermediate
- Basic
- Novice
- Zero”
The statement resulting from this question would be the distribution of frequencies as well as a ranking of the skills level in Excel.
Interval and ratio level variables (also called continuous level variables) have the most detail associated with them. Mathematical operations such as addition, subtraction, multiplication, and division can be accurately applied to the values of these variables. Examples of an interval variable are temperature, time, date, or a score given in a procurement or recruitment procedure. An example of a ratio variable is income in euro: in addition to equal differences between values, it has a true zero point, so ratios between amounts are meaningful.
Example interval: “How much do you trust the European parliament on a scale from 0 to 10? 0 means you have no trust in the institution at all, and 10 means you have complete trust.
0 - No trust at all
to
10- Complete trust”
Because we measure the differences on a continuum, we can assume the differences between values to be equal. Thus, we can make observations such as: trust in the European Parliament in country A is two points higher than in country B. Note that ratio statements (e.g. “twice as high”) are not meaningful for interval variables, as there is no true zero.
Example ratio: “What is the biodiversity-related expenditure per year since 2020?
Biodiversity related expenditure per year is: ”
With this information, it is possible to compare, rank, and describe ratios, rates, intervals.
[/toggle]
[/toggles]
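One practical consequence of the measurement level is which summary statistics are meaningful for the responses. The sketch below encodes the usual convention (mode for nominal; plus median for ordinal; plus mean for interval and ratio); the exact mapping is illustrative.

```python
# Sketch: apply only the summary statistics that are meaningful at the
# variable's measurement level (the mapping follows the usual convention).
from statistics import mean, median, mode

PERMISSIBLE = {
    "nominal":  {"mode"},
    "ordinal":  {"mode", "median"},
    "interval": {"mode", "median", "mean"},
    "ratio":    {"mode", "median", "mean"},  # ratios also meaningful
}

def describe(values, level):
    """Summarise responses with level-appropriate statistics only."""
    allowed = PERMISSIBLE[level]
    stats = {"mode": mode, "median": median, "mean": mean}
    return {name: fn(values) for name, fn in stats.items() if name in allowed}

# Ordinal Excel-skill codes: 1=Zero ... 5=Advanced
print(describe([1, 2, 2, 3, 4], "ordinal"))  # {'mode': 2, 'median': 2}
```

Computing a mean of ordinal codes, for example, would suggest a precision the scale does not support; restricting the statistics to the measurement level avoids such false precision.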
Use open-ended questions sparingly
Including many open-ended questions in a survey risks overburdening the respondents, and the resulting responses will be time-consuming for the team to analyse.
However, in certain cases questions with open-ended answers should be used, for example when the complete list of possible answers is unknown.
Open-ended questions can also be used to:
- avoid excessively long lists of response options;
- avoid steering responses through pre-determined categories;
- enquire about sensitive information; or
- express criticism or make comments on specific issues.
In addition, it is always good practice to give respondents the possibility to provide comments at the end of a survey questionnaire.
Bear in mind that open-ended questions will require more resources to [link title="analyse" link="%2Faware" /]
and may result in fewer responses than for closed questions.
Prepare invitation texts
Make the invitation email and welcome page informative, inviting, and attractive, as they will have a direct influence on the extent to which respondents will engage with the survey. Keep the text concise but provide basic information on why the survey is being undertaken, how the results will be used, who it is addressed to, and by when a response is required.
In particular include:
- a short, accurate title;
- the ECA logo;
- an explanation of the purpose of the survey, and how it relates to the audit or review task in question;
- the importance of participation;
- a request for the respondent’s co-operation;
- link to the privacy statement;
- information about any anonymity commitments;
- an indication of the expected time to complete the survey;
- a contact email address in case of questions;
- the deadline;
- information on how the survey results will be used, and whether they will be accessible (including any commitment to provide respondents with a link to the report after publication);
- essential technical instructions necessary to use the tool; and
- a thank you to the respondents.
The deadline to answer the survey will vary depending on the nature of the respondents and on the complexity of the questions. Generally, the deadline for completion will be one to two weeks. It is good practice to send the invitation at the beginning of a given week, with a reminder before the deadline. If the response rate is insufficient after the deadline, consider extending it. Consider contacting non-responding addressees individually, to see if they need any specific support to reply.
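Monitoring the response rate against a target can support the extension decision. A minimal sketch, assuming an illustrative 60 % target (the appropriate target depends on the task):

```python
# Hedged sketch: check the response rate against a target to decide
# whether to extend the deadline. The 60 % target is an assumption.

def response_rate(responses: int, invited: int) -> float:
    """Share of invited addressees who have responded."""
    return responses / invited

def should_extend(responses: int, invited: int, target: float = 0.6) -> bool:
    """True if the target response rate has not yet been reached."""
    return response_rate(responses, invited) < target

print(response_rate(45, 120))  # 0.375
print(should_extend(45, 120))  # True: consider extending the deadline and
                               # contacting non-respondents individually
```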
[toggles]
[toggle title="Example%20of%20an%20invitation%20email"]
Subject line: ECA survey on Smart Cities – your opinion matters!
Dear Mr XY,
The European Court of Auditors is conducting a performance audit on EU Research and Development in the field of Smart Cities. The audit focuses on the Horizon 2020 Lighthouse programme and its purpose is to formulate recommendations to the European Commission considering present and future initiatives in this area.
We need your view to ensure that our recommendations are useful and help to make EU initiatives more efficient and effective. It will take approximately 10 minutes to complete the questionnaire.
The link below directs you to an online questionnaire made available through the secure website of EUSurvey: https://.... You are kindly asked to submit your reply by XX.
Your individual answers will be held confidentially by the European Court of Auditors and will not be revealed to any other party. Only anonymised and summarised results will be communicated outside the ECA.
If you have any questions, please do not hesitate to contact the audit team at ECA-Smartcities@eca.europa.eu
Thank you very much for your valuable time and cooperation.
The ECA audit team (Personal signature & hierarchy level)
[/toggle]
[/toggles]
[toggles]
[toggle title="Example%20of%20a%20welcome%20page"]
This questionnaire has been sent to all Horizon 2020 Lighthouse project participants. It forms part of a performance audit by the European Court of Auditors (ECA) on "EU Research and Development in the field of Smart Cities".
Your response will be used to formulate recommendations to the European Commission, which will be included in a Special Report to be published by the ECA on XX.
Thank you very much for taking the time to answer our questions.
For more information on the collection and processing of personal data, please consult the privacy statement (linked on the top right side of the page).
[/toggle]
[/toggles]
Prepare the privacy statement for your survey. Always consult the Data Protection Officer (DPO) and the data protection coordinator (DPC) of your chamber, who will help you draft the statement.
As a general rule, surveys addressed to recipients or managers of EU funds are not anonymous, meaning that the survey participants can be clearly identified. However, we can guarantee to the respondents that only aggregated data will be shared with our principal auditee. If you run the survey in anonymous mode, please do not include any question that could identify respondents, ensure that a respondent can send only one answer and seek the advice of the DPO and the DPC of your Chamber. If your survey includes sections for open questions, please explicitly state within the survey that participants should refrain from including any personal data.
Languages
Choose the language of the survey according to the respondents. Surveys can be conducted in one working language (generally English) if respondents are within the EU institutions and agencies. If the respondents are in the member states, whether they are authorities managing EU funds, beneficiaries, or citizens, the survey normally should be provided in their official languages, unless you agree with the respondents beforehand that this is not needed. If the questionnaire is translated, it is good practice to test each language version separately.
Obtain contact details
Whether the survey involves all or only a sample of respondents, it is necessary to obtain an up-to-date and complete list from the auditee. Once the selection is made, obtain from the auditee the contact details needed for distributing the survey, if not already available.
Remember that contact details are personal data, and that you should delete them from your records once you have completed the survey.
Obtain feedback on the proposed survey
Offer the auditee the possibility to review and comment on the proposed questionnaire and take account of their feedback as appropriate. Consider extending this consultation to the supreme audit institution of the member state(s) concerned if this could give useful insight.
Testing
Test the questionnaire with a small number of respondents before you run the full-scale survey. This can help determine if they understand the questions and answer categories and are able to respond accurately and reliably. Testing the survey on colleagues is unlikely to give the same level of assurance.
To test the questionnaire:
- select a few respondents to whom to address the test questionnaire. These can be, for example, respondents known to you from past on-the-spot visits, or contacts from the auditee;
- prepare a short letter/email for the test respondent(s) explaining the objective and the process of the test, and explaining that you would wish to interview them about their response;
- prepare and distribute the test questionnaire in the format in which it will be used;
- conduct a follow-up interview with each test respondent to gain their feedback on the extent to which they understood the questions and answers, and any technical problems they encountered. The interview can be focused on those parts of the questionnaire where you expect most problems; and
- analyse the responses and feedback from the interviews, and revise the questionnaire as necessary.
You can request the support of the DQC survey support team for preparing, running and analysing the results of the test.
Running the survey
Once the testing has been completed, introduce the final questionnaire into [link new-window title="EUSurvey" link="https%3A%2F%2Fec.europa.eu%2Feusurvey%2Fauth%2Flogin" icon="external-link" /]
(or other distribution method). Take care with formatting and aim for consistency in presentation. The DQC survey support team can help you in this process.
Launch the survey, and then monitor its progress during the response period. Consider and respond to every question from respondents, and take any corrective action needed as quickly as possible.
Send one or two reminders to the respondents who have not yet responded. Time your reminders to maximise the response rate. Send the first reminder at around the mid-point of the response window, and the second one shortly before the initial deadline. It is good practice to send reminders at the beginning of a week. Extend the deadline if necessary to achieve the desired response rate.
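The timing above can be sketched as a simple schedule computed from the launch date; the two-week window and the two-day offset before the deadline are illustrative assumptions.

```python
# Illustrative reminder schedule: first reminder at the mid-point of the
# response window, second one shortly before the deadline.
from datetime import date, timedelta

def reminder_schedule(launch: date, window_days: int = 14) -> dict:
    """Compute reminder dates and the deadline from the launch date."""
    deadline = launch + timedelta(days=window_days)
    return {
        "first_reminder": launch + timedelta(days=window_days // 2),
        "second_reminder": deadline - timedelta(days=2),
        "deadline": deadline,
    }

# Launching on a Monday, as recommended:
schedule = reminder_schedule(date(2024, 3, 4))
print(schedule["first_reminder"])  # 2024-03-11 (also a Monday)
```

With a two-week window, the mid-point reminder also falls at the beginning of a week, in line with the good practice above.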
Consider sending thank you emails to respondents after the completion of the survey (The [link new-window title="EUSurvey" link="https%3A%2F%2Fec.europa.eu%2Feusurvey%2Fauth%2Flogin" icon="external-link" /]
tool has this functionality).
Resources
[icons-list icon-size="2" separator="line" vertical-alignment="middle" icon-vertical-alignment="middle"]
[icon-list-item title="EUSurvey" description="is%20the%20recommended%20tool%20for%20online%20surveys%20in%20audits.%20If%20you%20wish%20to%20use%20an%20alternative%20survey%20tool%2C%20it%20should%20be%20covered%20by%20a%20specific%20contract%20with%20the%20supplier%20and%20comply%20with%20GDPR.%20" icon="external-link" link="https%3A%2F%2Fec.europa.eu%2Feusurvey%2F" linking="new-window" /]
[/icons-list]
References
- Bogner, K., Landrock, U. 2016. “
[link new-window title="Response%20Biases%20in%20Standardised%20Surveys" link="https%3A%2F%2Fwww.gesis.org%2Ffileadmin%2Fupload%2FSDMwiki%2FBognerLandrock_Response_Biases_in_Standardised_Surveys.pdf" icon="external-link" /]
”
- Eurostat “
[link new-window title="Handbook%20of%20recommended%20practices%20for%20questionnaire%20development%20and%20testing%20methods" link="https%3A%2F%2Fec.europa.eu%2Feurostat%2Fdocuments%2F64157%2F4374310%2F13-Handbook-recommended-practices-questionnaire-development-and-testing-methods-2005.pdf%2F52bd85c2-2dc5-44ad-8f5d-0c6ccb2c55a0" icon="external-link" /]
”
- Koch, A., & Blohm, M. 2016. “
[link new-window title="Nonresponse%20Bias.%20GESIS%20Survey%20Guidelines" link="https%3A%2F%2Fwww.gesis.org%2Ffileadmin%2Fupload%2FSDMwiki%2FNonresponse_Bias_Koch_Blohm_08102015_1.1.pdf" icon="external-link" /]
”
- Menold, K., Bogner, K. 2016. “
[link new-window title="Design%20of%20Rating%20Scales%20in%20Questionnaires" link="https%3A%2F%2Fwww.gesis.org%2Ffileadmin%2Fupload%2FSDMwiki%2FRatingskalen_MenoldBogner_08102015_1.1.pdf" icon="external-link" /]
”
- Book: “
[link new-window title="Mail%20and%20internet%20surveys%3A%20the%20tailored%20design%20method." link="https%3A%2F%2Fwww.researchgate.net%2Fpublication%2F230557511_Mail_and_Internet_Surveys_The_Tailored_Design_Method" icon="external-link" /]
”, Dillman, D. 2000; ISBN: 978-0471323549
- Book: “
[link new-window title="Methods%20for%20Testing%20and%20Evaluating%20Survey%20Questionnaires" link="https%3A%2F%2Fonlinelibrary.wiley.com%2Fdoi%2Fbook%2F10.1002%2F0471654728" icon="external-link" /]
”, Presser, S., et al. 2004; Print ISBN-13: 978-0471458418, Online ISBN: 978-0471654728
- Book: “
[link new-window title="Was%20ist%20eine%20gute%20Frage%3F" link="https%3A%2F%2Flink.springer.com%2Fbook%2F10.1007%2F978-3-531-91441-1" icon="external-link" /]
”, Pruefer, R., et al. 2009, ISBN-13: 978-3531199146. Abstract is available [link new-window title="here" link="https%3A%2F%2Fwww.google.de%2Fbooks%2Fedition%2FWas_ist_eine_gute_Frage%2FFjPdJFnR2HEC%3Fhl%3Dde%26gbpv%3D1" icon="external-link" /]
.
- Book: “
[link new-window title="The%20Psychology%20of%20Survey%20Response" link="https%3A%2F%2Fwww.cambridge.org%2Fcore%2Fbooks%2Fpsychology-of-survey-response%2F46DE3D6F7C1399BCDC78D9441C630372" icon="external-link" /]
”, Tourangeau, R., Rips, L. J., Rasinski, K. 2000, ISBN-13: 978-0511819322
- Book: “
[link new-window title="Cognitive%20Interviewing%20%E2%80%93%20A%20Tool%20for%20Improving%20Questionnaire%20Design" link="https%3A%2F%2Fus.sagepub.com%2Fen-us%2Fnam%2Fcognitive-interviewing%2Fbook225856" icon="external-link" /]
”, Willis, G. B. 2005, ISBN-13: 978-0761928041
[/toc-this]