ARP Blog Post 4 – Research Methods

In developing this intervention, I have drawn on analysis of statistics and pedagogical theories, as well as my own experience of designing interventions in the structuring of crit sessions. In applying action research (McNiff & Whitehead, 2010), I felt that a productive source of data would be my professional peers. After weighing up research methods, I decided that interviews would be most appropriate, as they enable an in-depth exploration of educators' perspectives, assumptions and decision-making processes.

Analysing interview models

I interviewed three of my peers from the School of Media. We all teach undergraduate courses, which have similarities in unit briefs and assessment approaches, and all have crits as part of our embedded delivery. I considered two types of interviews: the Interview Guide Approach and Standardised Open-Ended Interviews (Patton, 1980).

I recognised that the Interview Guide Approach could provide more ‘naturalness’ (Woods, 1986). However, by applying such a spontaneous approach, I risked getting carried away and missing key data on the participants’ reflections on their practice.

Because I know the interviewees fairly well on a personal level, I felt that the element of ‘trust’ (Woods, 1986) was already established. I therefore decided that Standardised Open-Ended Interviews would be most suitable, as they would still leave a window for spontaneity in the responses. This would hopefully result in more fluid and organic exchanges, providing me with richer and more nuanced data than a completely closed approach. However, the pre-determined interview schedule could ‘limit naturalness’ (Patton, 1980), something I would have to remain aware of during the process.

Also, given our shared institutional background, I assumed that my interviewees would not be ‘neutral mirrors’ but that their ‘statements will still be impacted by professional roles and impression management’ (Alvesson, 2012). I hoped that balancing prepared questions with openness to spontaneity in conversation would reduce the chance of performance pressure.

Offline to Online

Given that my interviews were to be a mix of online and offline (one in person, two on Teams), an awareness of the differences between the two modes was important. My initial email contact with all interviewees was identical. After my first in-person interview, I realised that further clarification would have been sensible: I had not prepared my first interviewee (I had neither sent the interview schedule in advance nor outlined the topics in an email response), but as this interview was in person and without a fixed timeframe, we had the opportunity to do this before the interview began. Furthermore, drawing on Irvine, Drew and Sainsbury’s analysis of the differences between co-present and remote interviews (2013), I recognised that although my Teams interviews were visually face-to-face, they still shared several challenges typical of non-co-present modes, such as fewer non-verbal cues and an increased need for clarification. To mitigate this, I provided the interview questions in advance and began each session with a short summary of the project, which I hoped would reduce uncertainty.

See attached the interview schedule [link].

See attached the links to the participant consent form [link] and information sheet [link].

Student feedback

Having not included student feedback in my previous intervention, I aim to gather it in class a week after the crit session. This will be a key element in gauging the success of the intervention’s aspects, as well as a further opportunity for qualitative data collection. I aim to apply an evaluative process involving three steps: perception, interpretation and evaluation (Gaertner, 2014). I also hope the findings will lead me to further theoretical exploration in successor research.

Bibliography

Alvesson, M. (2012) ‘Views on interviews: A skeptical review’, in Interpreting Interviews. London: SAGE Publications.

Gaertner, H. (2014) ‘Effects of student feedback as a method of self-evaluating the quality of teaching’, Studies in Educational Evaluation, vol. 42.

Irvine, A., Drew, P. & Sainsbury, R. (2013) ‘“Am I not answering your questions properly?” Clarification, adequacy and responsiveness in semi-structured telephone and face-to-face interviews’, Qualitative Research. London: SAGE Publications.

McNiff, J. & Whitehead, J. (2010) You and your action research project. 3rd edn. London: Routledge.

Patton, M. Q. (1980) Qualitative evaluation methods. Beverly Hills, CA: Sage.

Woods, P. (1986) Inside schools: Ethnography in educational research. London: Routledge & Kegan Paul.
