For the next series of lectures, we're going to be talking about questions. I think the real secret sauce of writing a good survey is being able to write questions that actually get at what you want. So, in this particular module, we're just going to get started: what do you consider before you start writing questions, and what does your starting process look like?

Writing good survey questions is a really hard task, and people commonly underestimate how hard it is to write really good, effective survey questions. Bad questions are one of the most common sources of error that we see in surveys. If you think about it, that makes a lot of sense. Right? A lot of what we're interested in are ephemeral, fairly abstract ideas, and we're trying to capture those in the amber of a survey question, pinning them down so that all people will understand that question and answer it in consistent ways.

The most common rookie mistake I see with novice surveyors is to write questions first, and it seems like a really intuitive thing to do. All of us have answered survey questions for most of our adult lives, so we feel like we have a good sense of what a survey looks like and what a good survey question is, and sometimes we do. For some types of survey questions, it could be that we're very good at knowing what works and what doesn't. However, I'm going to push back and say: let's think about what else you can do before you start really buckling down and writing those questions.

All right. So, the challenge with writing good survey questions is that people are very different. Right? People in your population are very heterogeneous: they feel about things in lots of different ways, and they might have very different demographics. We want to write a survey question and push it out into the world, and often, especially with web surveys or mail surveys, you don't have any control over what happens with that survey question once it's in the hands of the respondent. So, you need to write questions that every person you ask will, one, be willing to answer; two, be able to respond to accurately; and three, interpret in the way that you want them to. Very vague questions, questions that many people could interpret in different ways, are not going to help you reach your goal of getting consistent answers.

Before you start thinking about writing your questions, there's a whole series of questions you should ask yourself first. Right? Think again about the user experience of the respondent as they're participating in your survey. What is that respondent going to see or hear first? We talked a lot about how we get people to respond to surveys in the first place. What will that first question look like? Is it going to be something they can immediately understand and find compelling, or is it going to be something that chases them away? What happens next? What does your flow look like? What's the narrative of your survey? How does it actually work over time, much like you would think about with any other user experience problem? Will the respondent think that answering you is important? Is that something you're signaling with your questions? Will they be able to understand the questions? Will the order of the questions, or maybe their other responses, affect how they answer them? Will they be willing and able to answer these questions that you've asked them?
All of these are questions we're going to dive into more in the next few modules, but they're good first places to start when you're thinking about, "What is the user flow of my survey?"

So, let's break into a couple of more general questions, things you should ask yourself before you begin writing your survey questions. Right? The first is: what concepts do I need to measure? I'm going to dive into this a lot in the next module, so let's put a pin in that one.

The next is: what type of information is the question asking for? Some questions are really easy to answer and some are really hard. Demographics, for instance, tend to be pretty easy for us to answer, because we've answered demographic questions on many, many forms and over many surveys throughout our lives. So, in this example you can see that I've asked "How old are you?" in three different ways, using three different types of response formats, open-ended as well as closed-ended. With a demographic question as common as this, it doesn't really matter much which format you use, because people are going to be able to answer the question easily; they've answered it so many times, in so many formats.

A question that a person hasn't thought about a lot, however, is going to be much harder for them to answer. So, in this question, I'm asking about civic engagement: how does information technology relate to civic engagement? Now, this is something not a lot of people have thought about much. You can see here that I have three different options: an ordinal scale, a slider, and then a horizontal ordinal scale. Believe it or not, because this is a cognitively tougher question, how I lay out these response categories, and what types of response categories I offer, can lead to very different results, because people are considering what civic engagement even means in this context and then trying to match that with the question type and the response type. We're going to talk more about how to pick the right question and response types for the types of questions we have. But this is a really big deal: we really have to think through what the right match is between my question and how I ask that question.
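To make that match between a question and its response format more concrete, here is a minimal, hypothetical sketch in Python. It is not from the lecture or from any real survey platform's schema; the Question class, the field names, and the answer categories are all invented for illustration. The point is simply that the wording of a question and the format of its response options are separate design choices that you pair deliberately.

```python
# Hypothetical sketch: the same question paired with different response formats.
# Not a real survey platform's schema; names and categories are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Question:
    text: str                    # wording shown to the respondent
    response_format: str         # e.g. "open_ended", "ordinal_scale", "slider"
    options: list[str] = field(default_factory=list)  # used by closed-ended formats


# An easy demographic item: the format barely matters, people answer it reliably.
age_open = Question("How old are you?", "open_ended")
age_ranges = Question("How old are you?", "ordinal_scale",
                      ["18-24", "25-34", "35-44", "45-54", "55+"])

# A cognitively harder attitude item: here the layout and the categories offered
# can meaningfully change the answers people give.
civic = Question(
    "How does information technology relate to your civic engagement?",
    "slider",  # could instead be a vertical or horizontal ordinal scale
)

for q in (age_open, age_ranges, civic):
    print(q.text, "->", q.response_format, q.options or "(free text)")
```

For an easy demographic item like age, any of these pairings works about equally well; for a harder attitude item like the civic engagement question, the choice of format and categories can change the answers you get.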
It's also important to remember, as you're thinking about asking people what they've done or who they are, the recall problems we talked about earlier in this series. Right? Remember that memory fades over time, and that it's hard to ask people about things they did a year ago, for instance. Individual instances of everyday actions aren't committed to memory very well; a lot of what we do is on automatic. So, if you're asking how often someone has brushed their teeth, they'll come up with a generalized answer, but they don't necessarily remember every single time they've done it, because it's such a routine task that we don't commit all of those instances to long-term memory. And people don't remember by calendar dates; we remember through context. So, I may not remember the date of my last traffic ticket, but I do remember that it happened around the same time as some other event. In general, people are bad at remembering specific dates. This is just a callback to all of the problems we actually have with memory, and how memory changes over time.

Another important question to consider before you start writing your survey questions is: what survey mode will I be using? Remember that mode refers to the channel you use to send your survey to your respondents. Right? As we've talked about, the presence or absence of an interviewer matters a lot for how people respond, due to social desirability and acquiescence pressures. It also matters a lot whether the information is oral or written. Oral, of course, also means that respondents are listening to somebody else read them the questions. For a lot of surveys, you're going to be interested mostly in written modes. But when people are listening to survey questions being read to them, they remember the last thing they heard best; when they're reading questions on a written form, they remember the first thing best. So, it's a very different cognitive process depending on whether questions are going to be read by an interviewer or written down for self-administration.

Another question to consider is: is this a question from a previous survey? It's very common in academic survey research to use validated scales. We're going to talk more about what that means, but it basically means scales or questions that have worked in other surveys. It could be that we want to compare responses about a product now versus what people thought six months ago, or it could be that we want to create a baseline, so we can see what a new campaign does six months out. If you're comparing questions across surveys, that shapes what types of questions you want to have: even small variations in wording can make answers non-comparable. So, if you have questions that you're repeating over multiple iterations of a survey, that says a lot about what you can and can't change.

Another thing to consider is whether respondents will want to answer a question accurately. What does that mean? What does "accurately" mean in this case? It could mean a lot of different things. We've talked about, for instance, how some questions can be embarrassing, and because people don't want to embarrass themselves, they may not answer the question. Or it could be that they guess. There's a term called satisficing, which means that people make a good-enough decision. Right? They'll make a good-enough choice, often rushing through questions just to get through the survey quickly. We want to make sure that people are carefully considering their answers, not barreling through them.

Survey researchers have created a lot of different models for this chain of interaction with a survey question, and this is one that the authors of the book we've been using have come up with. It starts with perception of the survey question. Right? What does the respondent see or hear when the question is asked of them? That moves to comprehension: does the respondent comprehend what the survey question means? Do they understand the individual words? Have we used academic keywords or jargon that they don't get? What does the respondent think when they see that question? Then we have retrieval. After respondents believe they understand what's being asked of them, they have to think: all right, what is my response to this? That can be recalling information from memory, in the case of behaviors; it could be thinking about what they truly think about an issue like civic engagement, for instance; or it could be that they have to go seek information, say about their financial records or something like that. But retrieval can be instantaneous, if the answer is easy to recall, or it can take time.
Then the respondent has to make a judgment: does the information they've retrieved match what they believe the goal of the survey question is? They've built a mental model of what you're really after, they've gathered their own information from memory or from what they believe, and now they're trying to judge: does that match up? Am I giving them a good answer? Then finally, there's the response: the respondent has to answer your question and record that answer in some fashion, understanding the response categories and converting what they think into the limited set of options you've given them to respond with.

So, this is just the start of our discussion of questions, framing how we get into the respondent's mindset and think about the user flow for the survey respondent as we start writing our questions. Getting good, consistent responses to survey questions is hard work; it's a real challenge, and one of the most underestimated challenges in survey work. You really want to consider how the respondent is going to experience this survey, and what type of information they're going to provide you based on that experience. In the next set of modules, we're going to talk more generally about how to write good survey questions, and then dive deeply into specific types of research questions and how to frame them.