This is “Designing Effective Questions and Questionnaires”, section 8.4 from the book Sociological Inquiry Principles: Qualitative and Quantitative Methods (v. 1.0).
To this point we’ve considered several general points about surveys including when to use them, some of their pros and cons, and how often and in what ways to administer surveys. In this section we’ll get more specific and take a look at how to pose understandable questions that will yield useable data and how to present those questions on your questionnaire.
The first thing you need to do in order to write effective survey questions is identify what exactly it is that you wish to know. As silly as it sounds to state what seems so completely obvious, I can’t stress enough how easy it is to forget to include important questions when designing a survey. Let’s say you want to understand how students at your school made the transition from high school to college. Perhaps you wish to identify which students were comparatively more or less successful in this transition and which factors contributed to students’ success or lack thereof. To understand which factors shaped successful students’ transitions to college, you’ll need to include questions in your survey about all the possible factors that could contribute. Consulting the literature on the topic will certainly help, but you should also take the time to do some brainstorming on your own and to talk with others about what they think may be important in the transition to college. Perhaps time or space limitations won’t allow you to include every single item you’ve come up with, so you’ll also need to think about ranking your questions so that you can be sure to include those that you view as most important.
Figure 8.7
To understand which factors shape successful students’ transitions to college, survey questions should take into consideration all the possible factors that could contribute.
Although I have stressed the importance of including questions on all topics you view as important to your overall research question, you don’t want to take an everything-but-the-kitchen-sink approach by uncritically including every possible question that occurs to you. Doing so puts an unnecessary burden on your survey respondents. Remember that you have asked your respondents to give you their time and attention and to take care in responding to your questions; show them your respect by only asking questions that you view as important.
Once you’ve identified all the topics about which you’d like to ask questions, you’ll need to actually write those questions. Questions should be as clear and to the point as possible. This is not the time to show off your creative writing skills; a survey is a technical instrument and should be written in a way that is as direct and succinct as possible. As I’ve said, your survey respondents have agreed to give their time and attention to your survey. The best way to show your appreciation for their time is to not waste it. Ensuring that your questions are clear and not overly wordy will go a long way toward showing your respondents the gratitude they deserve.
Related to the point about not wasting respondents’ time, make sure that every question you pose will be relevant to every person you ask to complete it. This means two things: first, that respondents have knowledge about whatever topic you are asking them about, and second, that respondents have experience with whatever events, behaviors, or feelings you are asking them to report. You probably wouldn’t want to ask a sample of 18-year-old respondents, for example, how they would have advised President Reagan to proceed when news of the United States’ sale of weapons to Iran broke in the mid-1980s. For one thing, few 18-year-olds are likely to have any clue about how to advise a president (nor does this 30-something-year-old). Furthermore, the 18-year-olds of today were not even alive during Reagan’s presidency, so they have had no experience with the event about which they are being questioned. In our example of the transition to college, heeding the criterion of relevance would mean that respondents must understand what exactly you mean by “transition to college” if you are going to use that phrase in your survey and that respondents must have actually experienced the transition to college themselves.
If you decide that you do wish to pose some questions about matters with which only a portion of respondents will have had experience, it may be appropriate to introduce a filter question into your survey. A filter question is designed to identify some subset of survey respondents who are then asked additional questions that are not relevant to the entire sample. Perhaps in your survey on the transition to college you want to know whether substance use plays any role in students’ transitions. You may ask students how often they drank during their first semester of college. But this assumes that all students drank. Certainly some may have abstained, and it wouldn’t make any sense to ask the nondrinkers how often they drank. Nevertheless, it seems reasonable that drinking frequency may have an impact on someone’s transition to college, so it is probably worth asking this question even if doing so violates the rule of relevance for some respondents. This is just the sort of instance when a filter question would be appropriate. You may pose the question as it is presented in Figure 8.8 "Filter Question".
Figure 8.8 Filter Question
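If you are building an online version of your questionnaire, a filter question becomes skip logic. The sketch below is a minimal illustration in Python; the question wording, response categories, and helper functions are hypothetical and are not copied from Figure 8.8.

```python
# A minimal sketch of filter-question (skip) logic, assuming a simple
# interactive questionnaire run at the command line. The question wording and
# response categories are illustrative, not copied from Figure 8.8.

def ask(prompt, options):
    """Show a question, list its response options, and return the one chosen."""
    print(prompt)
    for number, option in enumerate(options, start=1):
        print(f"  {number}. {option}")
    choice = int(input("Enter the number of your answer: "))
    return options[choice - 1]

def drinking_block():
    # The filter question identifies the subset of respondents (drinkers)
    # for whom the follow-up question is relevant.
    drank = ask("Did you drink alcohol during your first semester of college?",
                ["Yes", "No"])
    if drank == "No":
        # Nondrinkers skip the follow-up, so the rule of relevance is preserved.
        return {"drank": "No"}

    days = ask("On average, how many days per week did you drink?",
               ["Less than one time per week", "1-2", "3-4", "5-6", "7+"])
    return {"drank": "Yes", "days_per_week": days}

if __name__ == "__main__":
    print(drinking_block())
```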
There are some ways of asking questions that are bound to confuse a good many survey respondents. Survey researchers should take great care to avoid these kinds of questions. These include questions that pose double negatives, those that use confusing or culturally specific terms, and those that ask more than one question but are posed as a single question. Any time respondents are forced to decipher questions that utilize two forms of negation, confusion is bound to ensue. Taking the previous question about drinking as our example, what if we had instead asked, “Did you not drink during your first semester of college?” A response of no would mean that the respondent did actually drink—he or she did not not drink. This example is obvious, but hopefully it drives home the point to be careful about question wording so that respondents are not asked to decipher double negatives. In general, avoiding negative terms in your question wording will help to increase respondent understanding. (Though this is generally true, some researchers argue that negatively worded questions should be integrated with positively worded questions in order to ensure that respondents have actually read each question carefully. See, for example, Vaterlaus, M., & Higgenbotham, B. (2011). Writing survey questions for local program evaluations. Retrieved from http://extension.usu.edu/files/publications/publication/FC_Evaluation_2011-02pr.pdf)
You should also avoid using terms or phrases that may be regionally or culturally specific (unless you are absolutely certain all your respondents come from the region or culture whose terms you are using). When I first moved to Maine from Minnesota, I was totally confused every time I heard someone use the word wicked. This term has totally different meanings across different regions of the country. I’d come from an area that understood the term wicked to be associated with evil. In my new home, however, wicked is used simply to put emphasis on whatever it is that you’re talking about. So if you live in Maine and this chapter is extremely interesting to you, you might say that it is “wicked interesting.” If you hate this chapter and you live in Minnesota, perhaps you’d describe the chapter simply as wicked. I once overheard one student tell another that his new girlfriend was “wicked athletic.” At the time I thought this meant he’d found a woman who used her athleticism for evil purposes. I’ve come to understand, however, that this woman is probably just exceptionally athletic. While wicked may not be a term you’re likely to use in a survey, the point is to be thoughtful and cautious about whatever terminology you do use.
Asking multiple questions as though they are a single question can also be terribly confusing for survey respondents. There’s a specific term for this sort of question; it is called a double-barreled question, a question that is posed as a single question but in fact asks more than one question. Using our example of the transition to college, Figure 8.9 "Double-Barreled Question" shows a double-barreled question.
Figure 8.9 Double-Barreled Question
Do you see what makes the question double-barreled? How would someone respond if they felt their college classes were more demanding but also more boring than their high school classes? Or less demanding but more interesting? Because the question combines “demanding” and “interesting,” there is no way to respond yes to one criterion but no to the other.
Another thing to avoid when constructing survey questions is the problem of social desirability. We all want to look good, right? And we all probably know the politically correct response to a variety of questions whether we agree with the politically correct response or not. In survey research, social desirability refers to the idea that respondents will try to answer questions in a way that will present them in a favorable light. Perhaps we decide that to understand the transition to college, we need to know whether respondents ever cheated on an exam in high school or college. We all know that cheating on exams is generally frowned upon (at least I hope we all know this). So it may be difficult to get people to admit to cheating on a survey. But if you can guarantee respondents’ confidentiality, or even better, their anonymity, chances are much better that they will be honest about having engaged in this socially undesirable behavior. Another way to avoid problems of social desirability is to try to phrase difficult questions in the most benign way possible. Earl Babbie (2010) offers a useful suggestion for helping you do this—simply imagine how you would feel responding to your survey questions. If you would be uncomfortable, chances are others would as well. (Babbie, E. (2010). The practice of social research (12th ed.). Belmont, CA: Wadsworth.)
Finally, it is important to get feedback on your survey questions from as many people as possible, especially people who are like those in your sample. Now is not the time to be shy. Ask your friends for help, ask your mentors for feedback, ask your family to take a look at your survey as well. The more feedback you can get on your survey questions, the better the chances that you will come up with a set of questions that are understandable to a wide variety of people and, most importantly, to those in your sample.
In sum, in order to pose effective survey questions, researchers should identify exactly what it is they wish to know, keep questions clear and concise, make sure every question is relevant to the people asked to answer it (introducing filter questions where necessary), avoid double negatives, culturally specific terms, and double-barreled questions, keep social desirability in mind, and seek feedback on their questions from as many people as possible before administering the survey.
While posing clear and understandable questions in your survey is certainly important, so, too, is providing respondents with unambiguous response options. Response options are the answers that you provide to the people taking your survey. Generally respondents will be asked to choose a single (or best) response to each question you pose, though certainly it makes sense in some cases to instruct respondents to choose multiple response options. One caution to keep in mind when accepting multiple responses to a single question, however, is that doing so may add complexity when it comes to tallying and analyzing your survey results.
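To make that added complexity concrete, here is a small sketch, using Python and entirely made-up responses, of how tallying a “choose all that apply” item differs from tallying a single-response item: the multi-select answers must be flattened first, and percentages should be based on the number of respondents rather than the number of selections.

```python
from collections import Counter

# Hypothetical responses to a single-choice question: one answer per respondent.
single_choice = ["Dorm", "Off campus", "Dorm", "With family"]

# Hypothetical responses to a "choose all that apply" question: each respondent
# may select several options, so each record is a list rather than one value.
multi_choice = [
    ["Athletics", "Student government"],
    ["Athletics"],
    [],                                      # respondents may also select nothing
    ["Music", "Athletics", "Volunteering"],
]

# Tallying the single-choice question is a straightforward count of answers.
print(Counter(single_choice))

# Tallying the multi-select question requires flattening the lists first,
# and percentages must be based on respondents, not on total selections.
flat = [option for answers in multi_choice for option in answers]
counts = Counter(flat)
n_respondents = len(multi_choice)
for option, count in counts.items():
    print(f"{option}: chosen by {count / n_respondents:.0%} of respondents")
```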
Offering response options assumes that your questions will be closed-ended questions. In a quantitative written survey, which is the type of survey we’ve been discussing here, chances are good that most if not all your questions will be closed ended. This means that you, the researcher, will provide respondents with a limited set of options for their responses. To write an effective closed-ended question, there are a couple of guidelines worth following. First, be sure that your response options are mutually exclusive. Look back at Figure 8.8 "Filter Question", which contains questions about how often and how many drinks respondents consumed. Do you notice that there are no overlapping categories in the response options for these questions? This is another one of those points about question construction that seems fairly obvious but that can be easily overlooked. Response options should also be exhaustive. In other words, every possible response should be covered in the set of response options that you provide. For example, note that in question 10a in Figure 8.8 "Filter Question" we have covered all possibilities—those who drank, say, an average of once per month can choose the first response option (“less than one time per week”) while those who drank multiple times a day each day of the week can choose the last response option (“7+”). All the possibilities in between these two extremes are covered by the middle three response options.
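One way to convince yourself that a set of numeric response options is both mutually exclusive and exhaustive is to write the categories down as ranges and confirm that every possible answer falls into exactly one of them. The sketch below does this in Python for a hypothetical days-per-week drinking item; the category boundaries are illustrative and are not copied from Figure 8.8.

```python
# Hypothetical response categories for "On how many days per week did you drink?"
# expressed as ranges of whole days (0 through 7). Labels are illustrative only.
categories = {
    "Less than one time per week": range(0, 1),   # 0 days in a typical week
    "1-2 days per week": range(1, 3),
    "3-4 days per week": range(3, 5),
    "5-6 days per week": range(5, 7),
    "7 days per week": range(7, 8),
}

possible_values = range(0, 8)  # every answer a respondent could truthfully give

# Mutually exclusive: no value should fall into more than one category.
# Exhaustive: every value should fall into at least one category.
for value in possible_values:
    matches = [label for label, days in categories.items() if value in days]
    assert len(matches) == 1, f"{value} days matches {matches!r}"
    print(f"{value} days/week -> {matches[0]}")
```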
Surveys need not be limited to closed-ended questions. Sometimes survey researchers include open-ended questions in their survey instruments as a way to gather additional details from respondents. An open-ended question does not include response options; instead, respondents are asked to reply to the question in their own way, using their own words. These questions are generally used to find out more about a survey participant’s experiences or feelings about whatever they are being asked to report in the survey. If, for example, a survey includes closed-ended questions asking respondents to report on their involvement in extracurricular activities during college, an open-ended question could ask respondents why they participated in those activities or what they gained from their participation. While responses to such questions may also be captured using a closed-ended format, allowing participants to share some of their responses in their own words can make the experience of completing the survey more satisfying to respondents and can also reveal new motivations or explanations that had not occurred to the researcher.
In Section 8.4.1 "Asking Effective Questions" we discussed double-barreled questions, but response options can also be double barreled, and this should be avoided. Figure 8.10 "Double-Barreled Response Options" is an example of a question that uses double-barreled response options.
Figure 8.10 Double-Barreled Response Options
Other things to avoid when it comes to response options include fence-sitting and floating. Fence-sitters are respondents who choose neutral response options even when they actually have an opinion. This can occur if respondents are given, say, five rank-ordered response options, such as strongly agree, agree, no opinion, disagree, and strongly disagree. Some people will be drawn to respond “no opinion” even if they have an opinion, particularly if their true opinion is the nonsocially desirable opinion. Floaters, on the other hand, are respondents who choose a substantive answer to a question when really they don’t understand the question or don’t have an opinion. If a respondent is only given four rank-ordered response options, such as strongly agree, agree, disagree, and strongly disagree, those who have no opinion have no choice but to select a response that suggests they have an opinion.
As you can see, floating is the flip side of fence-sitting. Thus the solution to one problem is often the cause of the other. How you decide which approach to take depends on the goals of your research. Sometimes researchers actually want to learn something about people who claim to have no opinion. In this case, allowing for fence-sitting would be necessary. Other times researchers feel confident their respondents will all be familiar with every topic in their survey. In this case, perhaps it is OK to force respondents to choose an opinion. There is no always-correct solution to either problem.
Finally, using a matrix is a nice way of streamlining response options. A matrix is a question type that lists a set of questions for which the answer categories are all the same. If you have a set of questions for which the response options are the same, it may make sense to create a matrix rather than posing each question and its response options individually. Not only will this save you some space in your survey but it will also help respondents progress through your survey more easily. A sample matrix can be seen in Figure 8.11 "Survey Questions Utilizing Matrix Format".
Figure 8.11 Survey Questions Utilizing Matrix Format
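If you were generating a paper questionnaire programmatically, a matrix is straightforward to produce precisely because every item shares one set of response options. The following Python sketch prints a rough text-only matrix; the item wording and scale labels are illustrative and are not the ones shown in Figure 8.11.

```python
# A minimal sketch of a matrix question: several items share one response scale.
# Item wording and scale labels are illustrative, not taken from Figure 8.11.

scale = ["Strongly agree", "Agree", "Disagree", "Strongly disagree"]

items = [
    "My college classes are more demanding than my high school classes.",
    "I have made close friends since starting college.",
    "I know where to go on campus when I need help.",
]

def render_matrix(items, scale):
    """Print the matrix with one shared header row of response options."""
    print(" " * 70 + "  ".join(scale))
    for item in items:
        # One checkbox per scale point, aligned under the shared header labels.
        cells = "  ".join("[ ]".center(len(label)) for label in scale)
        print(f"{item:<70}{cells}")

render_matrix(items, scale)
```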
In addition to constructing quality questions and posing clear response options, you’ll also need to think about how to present your written questions and response options to survey respondents. Questions are presented on a questionnaire, the document (either hard copy or online) that contains all your survey questions and on which respondents read and mark their responses. Designing questionnaires takes some thought, and in this section we’ll discuss the sorts of things you should think about as you prepare to present your well-constructed survey questions on a questionnaire.
One of the first things to do once you’ve come up with a set of survey questions you feel confident about is to group those questions thematically. In our example of the transition to college, perhaps we’d have a few questions asking about study habits, others focused on friendships, and still others on exercise and eating habits. Those may be the themes around which we organize our questions. Or perhaps it would make more sense to present any questions we had about precollege life and habits and then present a series of questions about life after beginning college. The point here is to be deliberate about how you present your questions to respondents.
Once you have grouped similar questions together, you’ll need to think about the order in which to present those question groups. Most survey researchers agree that it is best to begin a survey with questions that will make respondents want to continue (Babbie, 2010; Dillman, 2000; Neuman, 2003). In other words, don’t bore respondents, but don’t scare them away either. There’s some disagreement over where on a survey to place demographic questions such as those about a person’s age, gender, and race. On the one hand, placing them at the beginning of the questionnaire may lead respondents to think the survey is boring, unimportant, and not something they want to bother completing. On the other hand, if your survey deals with some very sensitive or difficult topic, such as child sexual abuse or other criminal activity, you don’t want to scare respondents away or shock them by beginning with your most intrusive questions. (Babbie, E. (2010). The practice of social research (12th ed.). Belmont, CA: Wadsworth; Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York, NY: Wiley; Neuman, W. L. (2003). Social research methods: Qualitative and quantitative approaches (5th ed.). Boston, MA: Pearson.)
In truth, the order in which you present questions on a survey is best determined by the unique characteristics of your research—only you, the researcher, hopefully in consultation with people who are willing to provide you with feedback, can determine how best to order your questions. To do so, think about the unique characteristics of your topic, your questions, and most importantly, your sample. Keeping in mind the characteristics and needs of the people you will ask to complete your survey should help guide you as you determine the most appropriate order in which to present your questions.
You’ll also need to consider the time it will take respondents to complete your questionnaire. Surveys vary in length, from just a page or two to a dozen or more pages, which means they also vary in the time it takes to complete them. How long to make your survey depends on several factors. First, what is it that you wish to know? Wanting to understand how grades vary by gender and year in school certainly requires fewer questions than wanting to know how people’s experiences in college are shaped by demographic characteristics, college attended, housing situation, family background, college major, friendship networks, and extracurricular activities. Keep in mind that even if your research question requires a good number of questions be included in your questionnaire, do your best to keep the questionnaire as brief as possible. Any hint that you’ve thrown in a bunch of useless questions just for the sake of throwing them in will turn off respondents and may make them not want to complete your survey.
Second, and perhaps more important, how long are respondents likely to be willing to spend completing your questionnaire? If you are studying college students, asking them to use their precious fun time away from studying to complete your survey may mean they won’t want to spend more than a few minutes on it. But if you have the endorsement of a professor who is willing to allow you to administer your survey in class, students may be willing to give you a little more time (though perhaps the professor will not). The time that survey researchers ask respondents to spend on questionnaires varies greatly. Some advise that surveys should not take longer than about 15 minutes to complete (see http://www.worldopinion.com/the_frame/frame4.html, cited in Babbie, E. (2010). The practice of social research (12th ed.). Belmont, CA: Wadsworth), while others suggest that up to 20 minutes is acceptable (Hopper, J. (2010). How long should a survey be? Retrieved from http://www.verstaresearch.com/blog/how-long-should-a-survey-be). As with question order, there is no clear-cut, always-correct answer about questionnaire length. The unique characteristics of your study and your sample should be considered in order to determine how long to make your questionnaire.
A good way to estimate the time it will take respondents to complete your questionnaire is through pretesting. Pretesting allows you to get feedback on your questionnaire so you can improve it before you actually administer it. Pretesting can be quite expensive and time consuming if you wish to test your questionnaire on a large sample of people who very much resemble the sample to whom you will eventually administer the finalized version of your questionnaire. But you can learn a lot and make great improvements to your questionnaire simply by pretesting with a small number of people to whom you have easy access (perhaps you have a few friends who owe you a favor). By pretesting your questionnaire you can find out how understandable your questions are, get feedback on question wording and order, find out whether any of your questions are exceptionally boring or offensive, and learn whether there are places where you should have included filter questions, to name just a few of the benefits of pretesting. You can also time pretesters as they take your survey. Ask them to complete the survey as though they were actually members of your sample. This will give you a good idea about what sort of time estimate to provide respondents when it comes time to actually administer your survey, and about whether you have some wiggle room to add additional items or need to cut a few items.
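As a small illustration of how pretest timings can be turned into the estimate you quote to respondents, here is a Python sketch with entirely made-up completion times.

```python
import statistics

# Hypothetical completion times, in minutes, recorded while timing pretesters.
pretest_minutes = [11.5, 9.0, 14.0, 12.5, 10.0]

average = statistics.mean(pretest_minutes)
slowest = max(pretest_minutes)

print(f"Average pretest completion time: {average:.1f} minutes")
print(f"Slowest pretester: {slowest:.1f} minutes")

# Quoting a figure slightly above the average (rounded to a whole number)
# gives respondents a realistic expectation without overstating the burden.
print(f"Time estimate to give respondents: about {round(average + 1)} minutes")
```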
Perhaps this goes without saying, but your questionnaire should also be attractive. A messy presentation style can confuse respondents or, at the very least, annoy them. Be brief, to the point, and as clear as possible. Avoid cramming too much into a single page, make your font size readable (at least 12 point), leave a reasonable amount of space between items, and make sure all instructions are exceptionally clear. Think about books, documents, articles, or web pages that you have read yourself—which were relatively easy to read and easy on the eyes and why? Try to mimic those features in the presentation of your survey questions.