Transcript for:
Interviews in Educational Research Explained

In this web lecture, we will discuss terms and concepts that have to do with interviews, which are tools that we can use to gather information about someone's perspective on a topic, in our case related to the education sciences. First, we will define important terms and concepts. Then, we will discuss when to choose to use interviews in relation to overall research design. We will look at three example studies that use interviews: one based on a post-positivist paradigm, and two based on an interpretive paradigm. Next, we will discuss how to design an interview protocol and how to conduct a successful interview. So let's start with some important terms and concepts. The first one is interview protocol. The interview protocol includes the guidelines for conducting the interview, as well as the interview questions themselves. A lot of times people will refer to the interview protocol as just being the list of questions. But before you ask the interview questions, you need to have a plan for whether and how you will ask follow-up questions, whether you will record the interview and transcribe it, and how you will protect the participants' anonymity and store data. These details are part of your interview protocol. With regard to the different types of question sets you can have, you can do a structured, semi-structured, or open interview. A structured interview is when the researcher asks the participants a predetermined set of questions in a prescribed order without changing the questions or the order of those questions. A semi-structured interview is when the interviewer begins with a list of questions that are carefully chosen to lead to a comprehensive answer to the research question, but then the interviewer may decide to add or leave off questions during the interview.
The interviewer may spontaneously ask follow-up questions to gain further information, change the order of the questions in order to follow the participant's natural conversation, or leave off questions that have already been answered by the time they are reached in the question list. And then finally, an open interview starts with broad prompts instead of specific questions and positions the participant's story as the central focus. As you may have guessed, each of these interview approaches aligns with a particular research design, as we have discussed in previous lectures. Remember that your research design needs to take into account all of these various pieces and form one complete whole. That means that the epistemology that you believe in needs to align with the research paradigm and so on. On this diagram, you can see the various connections that you need to be able to explain between the different parts of your research design. When we talked about epistemology previously, we named objectivism, constructionism, and subjectivism as possibilities. The three types of interviews we just defined can be more or less aligned with these three different epistemologies. Here's what I mean. When you do a structured interview, you are positioning the researcher as the determiner of what needs to be asked and in what order. The researcher makes that decision based on a deductive process that has to do with filling a gap in knowledge. Since this process has been scientifically validated and done objectively, it should not matter which researcher conducts the interview, and the results should be generalizable and replicable. Typically, a structured interview study will also have a relatively large sample size compared to the other two types of interview studies. So if you are adopting an objectivist epistemology and a post-positivist research design, you will probably choose a structured interview method.
The semi-structured interview method positions the researcher and participant as co-constructors of the interview data. This approach reflects a constructionist epistemology and an interpretive or social constructivist research paradigm. Remember that a researcher using this design typically believes that the researcher's presence, experience, social position, and other traits always influence the research process. So rather than aiming for objectivity, the social constructivist researcher aims for trustworthiness. The goal of a semi-structured interview is not to generate a generalizable or replicable interview, but rather to best answer the research question according to the interaction between researcher and participant. The researcher may ensure quality in this type of design by conducting a member check with the participant after the interview is complete, in order to be sure that the information captured in the interview includes everything that the participant finds important about the topic. The final type of interview we defined above was the open interview. The open interview is typically used in narrative methodologies such as biographical studies or life histories. It reflects a subjectivist epistemology that places as central the lived experience and story of the participant. The participant is free to focus on the elements of the topic that the participant finds most important, which is not usually the case in the other two approaches, where the researcher writes the list of interview questions. So the type of interview you design and conduct should match everything else you wrote about your design. In other words, you should not say that you are conducting an open interview that is valid and reliable, because an open interview will be different every time it is conducted and will address whatever the participant finds important.
Standards of validity and reliability belong to a post-positivist research paradigm, whereas standards of transparency and trustworthiness ensure quality in a social constructivist paradigm. And various standards, for example related to an ethic of care or equitable participation, ensure quality in a postmodern or subjectivist paradigm. This is a good time in this lecture for a reminder that these lectures offer guidelines and syntheses of other researchers' work. For example, these brief sentences related to research quality synthesize dozens, if not hundreds, of articles about this topic. That means that you will ultimately need to do your own reading and thinking in order to make good decisions about your research study. There's no shortcut or checklist you can use to do this work for you. The papers from the examples in this lecture are a good place to start to see how other researchers have conducted their studies. You can and should refer to them and also find other examples as you are making your choices and conducting your own study. Let's look at some of these examples. The first example uses the method of structured interviews. It describes the process of developing and piloting a structured interview that can be used with children to understand how children think about their abilities to make their own decisions and exercise autonomy in classrooms. We will summarize the research design in this overview. The topic of the study was children's perspectives about their participation in early childhood classrooms. The study was conducted from a post-positivist paradigm. The researchers determined categories of participation and designed interview materials based on the literature. They asked all children the same set of questions in the same order, categorized the children's answers according to existing literature, and sought to generalize across settings.
The interview protocol was designed so that any researcher could follow the same procedures and come to the same results. The theory and research on which the protocol was based came from the fields of participation in early childhood education and care settings, as well as literature that describes children's perspectives on participation in general. This was a structured interview study that draws upon validation-study methodology, since it is a pilot study in which the protocol will be validated. The researchers interviewed 43 children in Portugal using the structured interview process. Here you can see just a snapshot of part of the protocol. You can view the whole thing on pages 57 and 58 of the study. You see how this protocol gives an overview not only of the interview questions, but also of the materials to use and the reason for each step. This protocol involves reading a narrative either about participation or non-participation, and then showing children images of a classroom and using the narrative and images to elicit responses. To analyze the data, the researchers categorized children's open-ended responses as positive, negative, or neutral, and used children's responses to create the categories of classroom purposes: as the teacher chooses, play, and work. The researchers then gave descriptive statistics linking children's responses to each of the two types of classrooms. They used additional statistical tests to ensure that sociodemographics did not affect the results, which is important in a validation study. This is a post-positivist study using a structured interview method. In our next example, we will see how Monika Louws and colleagues used semi-structured interviews in an interpretivist research design. They were curious about how teachers themselves talk about their own learning goals, since existing research had been primarily about what people other than teachers thought that teachers should learn. Here's an overview of their design.
They were curious not only about teachers' goals for their own learning, but also how these goals connected with teachers' career phase. Although their research question is phrased like a generalizable question that we would see in a post-positivist study, we can see that they actually used an interpretive approach. How do we know that? Well, the main researcher spent quite an extended amount of time in the school context in order to understand how to situate the study, which participants would be key to include, and to build rapport for the interview. When rapport is considered important, that is a good clue that the study is social constructivist. Attention to rapport between people shows the importance of the social interaction in the construction of knowledge. The research question seems like it would involve a statistical correlation between variables, but what the authors actually did was create a typology of different learning goals based on participants' responses and then look to see if there were trends between participants' learning goals and career stage. The outcome of the study was a typology or theory about this connection, not a statistical test or conclusion, which is more evidence that the authors are using a social constructivist design. We will see in the next slide how they used their theoretical framework and literature review to create their interview questions. On page 492 of the study, you can find this nice table that gives explicit details about how the interview questions were derived. In the left-hand column, you see a component of the overall framework guiding the study. In the middle column, you can see examples of the literature sources from which these perspectives came. Then, in the right-hand column, you can see how these perspectives were translated into interview questions. In this study, the interview was meant to answer the part of the research question about teachers' perspectives about their own learning. 
Then in the analysis, the authors synthesized individual teachers' answers by career phase so that they could investigate trends in teachers' perspectives according to those career phases. You can see very nice explicit evidence of this process listed by participant in Appendix A on page 504. Although we are not specifically focused on analysis here, each of these example papers has information about analytical procedures that you could also draw upon when making choices about analysis. For the third example, we will look at a study that I did at an alternative school in the United States. If you watch other lectures in this series, you will see this example a few times, and that's because it contains examples of several of the different concepts we will discuss, and also because I happen to know it really well. For the purpose of this lecture, we will look at how I used semi-structured interviews in this study. The topic of this study was how teachers' practices shape students' experiences at an alternative school. This study was conducted from an interpretive or social constructivist paradigm. In the paper, I wrote about how my experience as a former alternative school teacher informed my decision-making, my ability to build rapport with participants, and my interpretations of data. Just as we saw in Louws and colleagues' study before, in an interpretive paradigm, researchers do not make truth claims about generalizability to other contexts, but instead become conveyors of facts, relationships, and ideas that occur within a particular specific context. Standards for rigor in qualitative studies that are done from an interpretive paradigm have to do with trustworthiness rather than validity and reliability.
In other words, the goal is not to eliminate the researcher's bias and demonstrate that a study can be replicated, but rather to assert that the study cannot be replicated, and that the researcher has engaged in particular strategies to increase transparency and provide details so that readers can determine the applicability of the findings to their own contexts or research questions. So the research question was framed in a way that addressed the nuances of practice, which my own teaching experiences helped me to see and to capture. The question was, how do teachers' implementations of the three domains of teaching shape students' experiences in one disciplinary alternative school? The methodology of the study was a cross-case analysis of the cases of five teachers at the school. Data collection included interviews and document analysis along with observations, which then served as different data sources that could be triangulated, which is one way to ensure rigor in an interpretive qualitative study. So these were the considerations I needed to take into account when deciding upon an interview protocol. Remember that I needed to consider a different protocol for students, teachers, and administrators in this study, so that meant that I needed to translate the theoretical framework and research question into different interview questions depending on what I wanted to learn from each group of participants. You can find the complete list of interview questions for each of these groups of participants in Appendix A of the paper, and you can see there that the questions are organized according to the topics of the theoretical framework. Let's keep drawing upon these examples as we discuss how to design an interview protocol. I've designed a document called Interview Guide that you should be able to access along with this lecture. I've just put a snapshot of the top of it in this slide.
This Interview Guide includes a complete protocol for conducting an interview for the master's thesis using a semi-structured interview protocol. Of course, you have to insert your specific interview questions into the protocol, but it includes tips on how to manage the interview data and the things you need to remember, such as extra batteries for your recording device and getting informed consent in advance. It also includes the basic tips we will go over next regarding how to conduct an effective interview. Remember that the focus here will be on semi-structured interviews, since in structured interviews you will simply use the validated protocol that already exists, and in open interviews you will base your approach on your methodology. But before you can conduct the interview, you have to write the interview questions. There's really no specific set of questions or steps that you can always use to produce the perfect list of interview questions. Here are some considerations that Louws and colleagues used in their study and that I also used in my study, so you can see these examples explicitly described in our papers, as I mentioned above. We considered our theoretical frameworks and literature review, how those led us to think about the topic in a certain way, and how they pointed towards certain important ideas and experiences that we should ask our participants about. We also had to be sure that the collection of questions we asked would answer our research question. In the next few slides, you will find more details and ideas about how to write and order your interview questions. When you write your interview questions, you will start with your research question and then think about the sorts of questions you will need to ask your participants that will help you answer your research question. Remember that in your study design, and maybe even in the research question itself, you have included very specialized words and concepts that participants may not recognize.
So that means that your task is to figure out how to ask participants about the main ideas you're studying without using those specialized terms. Here's a table that gives an example of how to convert your research question, which is listed in the column on the left, into sample interview questions, which are listed in the column on the right. The arrow above the table shows how you will move through the process from left to right. You'll start on the left and move through each column until you get to the interview questions on the right. So in this example, the research question is, how do students experience study-related stress when writing their thesis? You may recognize this topic very well. However, if you were a participant trying to answer this question, you may not know what is meant by the special term study-related stress. So the researcher needs to conceptualize, or identify and define, the concepts that make up the phenomenon of study-related stress. In the second column, you can see that study-related stress is defined as physical and mental pressure. So then in the third column, we operationalize the concept of study-related stress into these two sub-concepts of physical pressure and mental pressure. Then, as we are doing this operationalization, we have to think about how the participant could recognize that they were experiencing physical or mental pressure. How could they know if they had had it? And how could the researcher recognize if they were talking about it? To answer these questions, we need to identify the specific indicators of physical and mental pressure, and we would use the literature to help us do this. You can see examples of indicators in the fourth column. Physical pressure might be indicated by exhaustion, pain, having a tense body, or maybe something else that the participant might add, which is why you see an additional empty row there.
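If it helps to keep this operationalization chain organized while drafting your own protocol, it can be sketched as a simple data structure. This is only an illustrative sketch in Python; the concept names, indicators, and question wordings below are hypothetical examples modeled on the study-related-stress table, not part of any published protocol.

```python
# Illustrative sketch: tracking the chain from research question to
# interview questions while designing a semi-structured protocol.
# All names and wordings are hypothetical examples.

operationalization = {
    "research_question": (
        "How do students experience study-related stress "
        "when writing their thesis?"
    ),
    "concept": "study-related stress",
    "sub_concepts": {
        "physical pressure": {
            "indicators": ["exhaustion", "pain", "tense body"],
            # leave room for indicators that participants add themselves
            "interview_questions": [
                "How did your body feel while you were writing your thesis?",
                "Can you describe a moment when writing felt physically hard?",
            ],
        },
        "mental pressure": {
            "indicators": ["lack of engagement", "preoccupation"],
            "interview_questions": [
                "What was on your mind while you were writing your thesis?",
                "How would you describe your mood during that period?",
            ],
        },
    },
}

# A quick completeness check before piloting: every sub-concept should
# have at least one indicator and one interview question.
for name, sub in operationalization["sub_concepts"].items():
    assert sub["indicators"], f"{name} has no indicators yet"
    assert sub["interview_questions"], f"{name} has no questions yet"
```

Writing the chain down this way makes it easy to spot a sub-concept that has indicators but no question yet, or a question that cannot be traced back to the research question.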
In interpretive qualitative inquiry, we want to leave room for input from participants that might not yet appear in the literature, or about which the researcher might not yet have thought. Then in the final column, you can see how the researcher turned these indicators into questions. The questions are designed to allow the participant to discuss specific indicators of physical pressure. Although we are not discussing data analysis in this lecture, you might also note that the indicators listed here could serve as themes, depending on the methodology you are using and the actual data you end up with. In the final row of the table, you see the additional example related to mental pressure. Mental pressure might be indicated by lack of engagement or preoccupation. We could get the participant to address mental pressure by asking them to give details about their experiences while writing the thesis, with specific follow-up questions related to emotions, mood, and preoccupation. So this is one example of how you could move from research question to interview questions. Let's discuss some additional strategies you could use besides a question list to get participants to discuss experiences. The first one is the use of artifacts. Artifact is a fancy word for a physical object or format that is used to generate discussion. You can think about the category of possible artifacts as containing things provided by the participant and things provided by the researcher. What we mean here is that you can ask participants to think ahead and prepare something or bring something to the interview that you will then discuss. Examples include poems, vignettes (a vignette is a short example that illustrates something specific), stories, or photographs.
If you ask the participant to prepare something, you will need to be very specific about your request, and you will also need to consider whether and how you will collect these artifacts for the data analysis. If you collect artifacts, you will need to consider possible ethical issues, such as individuals who can be identified in photographs. Perhaps the simplest option is to incorporate the description of the artifact into the interview and then not collect the artifact itself. For example, if you've asked the participants to bring photographs from the past that illustrate a particular thing or experience you want to discuss with them, you can ask them to first describe what's in the photo during the interview. Then the contents of the photo are recorded along with the interview transcript, and you don't need to collect the actual visual photo or see it again during the analysis. Similarly, you can have the participant read any written documents they prepared in advance during the interview itself, so that the written document also becomes part of the interview transcript. On the right-hand side of the slide, you see examples of artifacts that the researcher can bring or include in the interview questions themselves. In example one, we saw how the interview protocol included images that went along with a narrative that the interviewer showed to the children who were answering the interview questions. This is an example of how the researcher might use an image. Another example is that the researcher might use excerpts from an existing quantitative protocol to elicit participants' opinions about the questions there. Remember that this approach, as I'm describing it, doesn't include the participant taking the survey, but rather uses the survey questions as qualitative interview artifacts. If you want to have the participant take the survey and then give feedback on it during a qualitative interview, that would be an example of a multiple method study.
That would require a different analytical approach than an interview alone. The remaining two examples of researcher-supplied artifacts, stimulated recall and metaphor, will each be further explained on the slides coming up next. Let's start with stimulated recall. Stimulated recall is when the researcher uses observation data as a prompt to get participants to talk about the topic under study. The observation data might come from the researcher's own field notes or from a video recording. The moment chosen may come from the researcher or from the participant, depending on the study design. Then the researcher and participant view or recall the moment together, and the participant talks more about what happened during that moment. We will look at an example from Korthagen and colleagues of how they used stimulated recall. The design for this publication contains two studies, so we won't get into the details here. You can read the study for yourself, but we will just give some highlights here of how stimulated recall was used. In this study, researchers recorded 10 to 20 minute long videos of teacher-student interactions, and then those videos were used during the interview sessions. The teachers were asked to identify the two best contact moments in the video. In this way, their memory of the moment was stimulated, but the researcher also saw examples of how the teachers defined best contact moments. Participants were then asked more about these moments, and researchers used their answers to define good contact moments. Participants were also asked about the moments they did not choose, and why they did not choose them, which provided additional data during the interview. So this is an example of a study that uses an artifact that the researcher provides. In this case, the participant is selecting which part of the artifact is most relevant, but the researcher is providing the artifact as a source of discussion for the interview.
The final example of a researcher-supplied artifact we will discuss is metaphor. A metaphor is a comparison of a thing or experience to another symbolic thing or experience in order to illustrate a particular part of its essence. That's a bit of a complicated explanation, so we're going to look at this specific example by Visser-Wijnveen and colleagues to help us. In this study, the researchers wanted to understand how academics conceptualized knowledge, research, and teaching, because they were particularly interested in the relationship that academics themselves identified between their research-oriented activities and their teaching-oriented activities. In this study, 30 academics from one faculty of humanities at one Dutch university were interviewed. The researchers decided to use metaphors as prompts because academics are often unaware of their conceptions of knowledge, research, and teaching, and so asking about these things directly may not have yielded very detailed or knowledgeable responses. You can read more details about how the researchers incorporated the metaphors into the interview protocol, and how they chose the metaphors to use, if you look at the study itself. Note that they had very clear and transparent reasoning for how they derived the metaphors and how they made their choices. On this slide, you can see the metaphors they used. For each of the concepts under study, knowledge, research, and teaching, the participants were asked to respond to the set of metaphors, choose the one that appealed most to them, and reflect on how they think their answer might have changed since they entered the university. The researchers then conducted an analysis in which they first articulated key statements that captured the most critical parts of what each participant said. They grouped similar participants and then inductively named the different perspectives on knowledge, research, and teaching present in the sample.
They proceeded to conduct quantitative analyses using these groupings, which you may also decide to do according to the research paradigm guiding your study. When they did this analysis, they were able to identify correlations between the way academics view knowledge and research, as well as the way academics view knowledge and teaching. So this is just one example of how you might use metaphors as part of your interview protocol. Next, we will go over some details about how to word your questions. Now that you have some ideas about the questions you would like to ask, let's go over a few tips to keep in mind for choosing the words you will use. First, consider using what James Spradley calls tour questions. A tour question asks a participant to take you, the researcher, on an imaginary tour of an event or experience. Here's an example of a tour question: when your team meets to discuss student test scores, could you walk me through what happens from beginning to end? Be sure to define terms and avoid jargon. Jargon includes any specialized terms that ordinary people don't use during ordinary conversations. A good rule of thumb: if you were telling a family member or friend outside your field about your thesis, would you use this term? An example of jargon is, what are your experiences with identity regulation? A participant is unlikely to know what identity regulation is, so that is a concept that you need to further translate into an interview question using the operationalization process that we discussed earlier. Next, try to ask open-ended questions instead of yes-no questions. Remember that you want the participant to talk in as much detail as possible. You can use yes-no questions to check whether a follow-up question is relevant or not, but there should always be a follow-up question to a yes-no question if you use them at all. Ideally, you will use only open-ended questions.
A final tip is to ask authentic questions, meaning questions that you are really curious about. Try to limit your questions to those that you're really curious about in your study. And then there are also a few things you should try to avoid. First, avoid asking more than one question at a time. If you ask more than one question, the participant may get confused and is likely to answer only one of the things that you asked. Also, avoid using assumptions or judgments in your questions. An example of an assumption is, how have you used data to change your instruction? There's an assumption in the question because the researcher is assuming that the participant has used data to change instruction, which might not be the case. Try stopping at the word data to form a less presumptive question: how have you used data? Or even better, how have you or other teachers you know used data? In this final version of the question, the participant can still speak to it even if the participant themselves does not use data. The last thing to avoid is asking leading questions. A leading question steers a participant in a certain direction rather than letting the participant make the judgment. An example of a leading question is, what is good about teachers using data to make instructional decisions? This question is leading because the participant might have 20 negative things to say about using data and one positive thing, and you will only arrive at the positive thing by using this question. You have steered the participant away from discussing the negative things, even if they outnumber or outweigh the positive things in the participant's perspective. On the next slide, we will look at a few more ways that some novice researchers ask leading questions, and we will show you how to reword them. In this table, we define three types of leading, one in each row.
So you see the three types of leading are being suggestive, being normative or evaluative, and using interpretive labels and categories. In the middle column, you can see an example of each of these types of leading, and then in the right-hand column, you can see how to reword the question to avoid that type of leading. So in the first row, the example of leading is, if you say this, do you refer to X? By phrasing the question that way, you're suggesting that the two things are related. So instead of doing that, you can ask, what do you refer to when you say this? That is much more open, and the participant can then decide what the relationship between things is. So you're making progress on wording your questions, and now you need to consider what order to put the questions in. Your task is to strategically order the questions in a way that will support the conversation. Consider how to build rapport with the participants. Usually, you will want to ask broad overview questions first, and make sure not to ask questions that might be sensitive when you first start the interview. Build rapport first by asking less sensitive questions. In example study 3 above, I also talked with students, so I had to consider how to word the questions so that they would be developmentally appropriate and students would understand them. This was an additional consideration. Even if you take all of these things into account and do your best to make a clear and comprehensive set of questions, there's a good chance that something unexpected will happen when you actually ask the questions. In order to gain a perspective about what these unexpected occurrences might be, you should pilot the interview and then make revisions to any questions that the participant finds unclear or that lead to answers that are different than what you anticipate and perhaps unrelated to your research question. We will discuss more details about conducting a pilot study in a moment.
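As a rough illustration of these wording checks, the pitfalls discussed above (yes-no phrasing, asking more than one question at a time, evaluative words) can be screened mechanically before piloting. This sketch is purely illustrative: the word lists are my own assumptions, not a validated instrument, and such a heuristic can only flag candidates for a human to reconsider.

```python
# Illustrative heuristic screen for common question-wording pitfalls.
# The starter and leading-word lists below are assumed examples only.

YES_NO_STARTERS = ("do ", "did ", "is ", "are ", "have ", "has ", "was ", "were ")
LEADING_WORDS = ("good", "bad", "best", "worst", "should")

def flag_question(question: str) -> list[str]:
    """Return a list of possible wording problems for one draft question."""
    flags = []
    q = question.strip().lower()
    if q.startswith(YES_NO_STARTERS):
        flags.append("yes/no phrasing: plan a follow-up or rephrase open-ended")
    if q.count("?") > 1:
        flags.append("more than one question at a time")
    if any(word in q.split() for word in LEADING_WORDS):
        flags.append("possible leading/evaluative wording")
    return flags

print(flag_question("What is good about teachers using data?"))
# flags the evaluative word "good"
print(flag_question("How have you or other teachers you know used data?"))
# no flags
```

A screen like this cannot catch subtler leading, such as interpretive labels, so it complements rather than replaces the reworded examples in the table.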
First, we will go over a few tips about how to conduct the interview itself: what to do, what not to do, and how and when to ask follow-up questions. Once you have completed the questions and are ready to try them out, there are a few tips that can help you have a successful interview. These tips apply to semi-structured interviews and potentially also to open interviews, but not to structured interviews, since you will not formulate new follow-up questions during a structured interview itself. In the interview guide, I summarize the points I find most useful from two chapters, one by Irving Seidman and one by James Bradley. Here are the tips from Seidman. You can find details about how to do each of these things in his chapter. One thing I find that new interviewers sometimes have trouble with is knowing when to ask follow-up questions that are not written on their list of questions. That is why I recommend keeping a copy of your research question in front of you. During the interview, you should always be asking yourself what additional information might be useful to know about each answer the participant gives. You can decide this by keeping your overall research question in mind. Have a look at these tips, and then I will show you one example of how you might rephrase an interview question based on how the interview is unfolding. At the top of this slide, you can see the original list of interview questions from example study 3; these appear in Appendix A of that paper. Below is a segment of the actual transcript from the interview, in which I am speaker B, so you can see the question I asked in italics next to the letter B. It's question 9 from the interview question list. But then you can see that the next question I asked was a modified version of question 10, based on the answer the participant had given to question 9.
At the end of his answer, which is the first answer listed here, the teacher participant was describing that the students did not feel like they needed to put in effort, which already starts to deal with the content of question 10, which has to do with what the students need in order to be successful. It seemed like asking question 10 as listed would move us away from the flow of the conversation. So I asked a follow-up question based on what the teacher had just said about what he perceived the students did and did not know. I regarded this follow-up question as related to question 10 but paraphrased it in a way that was more conducive to letting the participant continue to tell his own story. This change let the participant know that I was listening carefully, not being repetitive or ignoring what he was saying. It is an example of what we mean by semi-structured, which, remember, can mean that you change the order or phrasing of questions, or decide to add or delete questions, based on how the interview is unfolding. Once you have followed all these steps and are ready to try out the tips, it's time to practice by conducting a pilot study. First, you need to choose a possible participant. Identify someone who is close to your sample but wouldn't be part of it, perhaps because they don't meet one of the inclusion criteria. The important thing here is that you choose someone who's knowledgeable and experienced enough to be able to answer your questions as fully as your participants would. Be sure to follow informed consent and data storage procedures, because you might use the data afterward if the interview goes well and if, in fact, the participant ends up being someone who could belong to your sample after all. You should record the pilot interview and also try to follow the tips in this lecture as best you can.
Then, after the interview is over, you should immediately ask the participant for feedback about the questions and take note of their answers so that you can revise the questions later. Next, you should transcribe the interview. Keep in mind that for every 15 minutes of recording, you will need about an hour to transcribe. That means that for a 60-minute interview, you should carve out a four-hour block of time afterward to spend on transcription. After you finish transcribing, you will determine whether the interview is the appropriate length, and you will analyze your interviewing behavior according to the criteria mentioned before. Pay particular attention to how much you talk in comparison to how much the participant talks. The participant should do almost all of the talking. Remember that when you do the analysis, you can only include the things that the participant actually says, so whatever you say as the researcher doesn't count as data. Use the information you gain to revise your interview protocol, practice your interview style, and start the real thing. This brings us to the end of this lecture on interviews. Remember that we learned the terms interview protocol, structured interview, semi-structured interview, open interview, stimulated recall, and metaphor. If you're unsure about what those mean, go back and look carefully at the example papers and previous slides. In addition to these terms, we discussed how your choice between structured, semi-structured, and open interviews will depend on your research paradigm. If you're using a post-positivist paradigm, you will probably choose a structured interview, and if you are using a social constructivist paradigm, you will probably choose a semi-structured interview. If you are doing a narrative study, you might choose a semi-structured interview or an open interview. You will need to investigate methodology a bit more before you make that choice and use some additional materials to help you.
Once you've chosen an approach, you will write your questions. To do that, you will rely upon your theoretical framework, literature review, and research question. You will operationalize your research concepts and use words that participants will understand. Several of the examples we discussed show how other researchers have done this. You will follow the guidelines for wording your questions and conducting the interview. Then, once you have a list of questions, be sure to pilot it, that is, test it out to see how well it works.