Volume 16, No. 3, Art. 27 – September 2015
Dear Critics: Addressing Concerns and Justifying the Benefits of Photography as a Research Method
Kyle Miller
Abstract: Photography serves as an important tool for researchers to learn about the contextualized lives of individuals. This article explores the process of integrating photo elicitation interviews (PEI) into research involving children and families. Much literature is dedicated to the general debate surrounding the ethics of visual methods in research, with little attention directed at the actual process of gaining study approval and publishing one's findings. Researchers must satisfy two main critics in order to conduct and disseminate studies involving visual images: ethics committees and peer reviewers. In this article, I identify and discuss some of the challenges that emerged while gaining protocol approval from an ethics committee in the United States. Ethical concerns and restrictions related to the use of photography can delay data collection and create barriers to research designs. Similarly, I describe the process of responding to reviewers' concerns as part of the publication process. Peer reviewers' lack of familiarity with photography as a research tool may lead to misunderstandings and inappropriate requests for manuscript changes. While many concerns are sound, the range of benefits stemming from the use of visual data helps to justify the time and energy required to defend this type of research. Implications are discussed for researchers using visual methods in their work.
Key words: visual methods; photo elicitation interviews; ethics in research; peer review
Table of Contents
1. Introduction
1.1 Photography as a research method
2. Approval to Conduct a Study
2.1 IRB critique: Do you have to use photographs?
2.2 Response: Yes, photographs are necessary
2.3 IRB critique: Protecting confidentiality and gaining consent
2.4 Response: Minimal risk
2.5 Compromise: Photography with restrictions
3. Approval to Disseminate Findings
3.1 Peer-review critique: Were the pictures staged?
3.2 The images do not stand alone
3.3 Peer reviewers: Quantifying images
3.4 Response: Inadequate representation of findings
4. Conclusion
1. Introduction
I often tell people that I did not find photography; photography found me. My introduction to the use of photography as a research tool came in a very unsolicited and unexpected way. A number of years ago, I was working on a large school readiness project. Part of the project required me to conduct interviews with parents about their memories of school and their current practices aimed at academically socializing children for kindergarten. While I was interviewing one mother about her school memories, she told me to "hold on" and disappeared into a back room for an extended period of time. She eventually returned with her high school yearbook and walked me through a number of pages displaying her adolescent years in school. She pointed out her oversized glasses, which she claimed were in vogue at the time. She called attention to the pictures of some of the "popular" girls who never spoke to her during those four years and to the image of her 16-year-old self standing in the back row of the English club1). There was something very compelling about the interview with this mother. The imagery of the yearbook offered a great deal of information that was not fully communicated through words alone, supported the participant's reflection, and strengthened researcher-participant rapport. [1]
The photographs did not end with this mother. In subsequent interviews, mothers and fathers grabbed pictures from photo albums or sorted through images on their phones to visually identify people and activities that were important to the development of their child. Clearly, there was a richness to these pictures, which changed the dynamic of our conversation. The photographs evoked emotions and information that did not emerge from my interview script. These visually based conversations marked the beginning of my interest in exploring the utility of photography in research. This article describes my journey into the visual world of research within the context of the United States. [2]
The article is divided into three main sections. The first section provides a brief overview of photography as a data collection tool, and some of the initial ethical concerns that may arise when used with human subjects. The second section documents the process of gaining approval from an ethics committee to incorporate photography in a study involving children and families. I share an example of the iterative negotiation process and compromises that were required to conduct the study. The third section focuses on the dissemination of the study's findings through the peer review process, and a similar negotiation process with reviewers to protect the integrity of the photographic data and meaning for the study's participants. [3]
1.1 Photography as a research method
The use of photography in research is a growing trend, reflecting the increasing affordability of, and interest in, a variety of technologies (ROWE, 2011; YATES, 2010). Social scientists have used photography since anthropologist John COLLIER (1967) introduced it as a valid and useful method for collecting data. Although photography can be incorporated in a number of ways, the photo elicitation interview (PEI) is my approach of choice (BANKS, 2001; HARPER, 2002). Photo elicitation, also referred to as photo interviewing (HURWORTH, CLARK, MARTIN & THOMSEN, 2003), involves the use of photographs to evoke comments, memory, reflection and discussion in the course of a semi-structured interview (HARPER, 2002). Specific examples of social interactions and activities depicted in photographs can serve as the foundation for a discussion of broader abstractions or of fine-tuned details of the images (BANKS, 2001). [4]
As I gravitated towards this method, I was excited about the kind of data photography could lend to the construction of a more holistic understanding of children and families. It was during the planning of my first photography-based study that I shared my camera ideas with a colleague. Her response was simple and direct: "Good luck getting that through IRB [Institutional Review Board]." The comment was delivered with a great deal of sarcasm and was my first warning that this kind of research would invite obstacles. Her response provided one explanation as to why photography is rarely used in studies in the field of education: the challenges associated with research approval. In fact, this colleague was not alone; a number of individuals encouraged me to abandon photography-based methods for fear that they would delay my research plans. [5]
The lack of agreement about the ethical and moral issues associated with the use and presentation of photographic data has prevented the widespread use of this useful and valuable research method (BANKS, 2001; PITT, 2014). Globally, ethics committees differ greatly in how they review and process study protocols, and approval is therefore granted inconsistently (REDSHAW, HARRIS & BAUM, 1996). Even within the United States, the Common Rule is not implemented uniformly in reviewing and approving human subjects research (LIDZ et al., 2012). With visual methods, gaining ethics approval and consent can be an onerous process, as can finding sites and participants who are willing or eager to participate in a photography-based project (PITT, 2014; YATES, 2010). Authors rarely address the ethical review process and instead focus on describing how they obtained permission (e.g., CLARK-IBANEZ, 2004), almost creating the illusion that the IRB is not an obstacle to conducting research using visual methods. A recent study found that although many visual researchers voice concern about ethical regulations, few actually have authentic stories related to their personal work (WILES, COFFEY, ROBINSON & HEATH, 2012). I, by contrast, have a story to share. [6]
As warned, gaining approval and publishing findings was an onerous process for my "Transition to School" study. I discuss the approval and publication journey in relation to the two main gatekeepers and critics of visual methods: ethics committees and peer reviewers. Whereas a number of articles contemplate the ethical ramifications of photography in research (e.g., WILES et al., 2008), limited scholarship addresses the actual process of gaining approval and defending the use of visual methods in the dissemination of research. This article addresses that gap by describing the complicated progression of generating a photography-based study, addressing concerns expressed by the critics, and justifying the benefits of using visual images in research. [7]
2. Approval to Conduct a Study
As part of a larger study on the Transition to School, I attempted to incorporate a sub-study involving photography. Taking the "autodriven" PEI approach (CLARK, 1999; CLARK-IBANEZ, 2004), I proposed that parents use a digital camera to capture images of activities with their children over the course of a week. Those photographs would then serve as the foundation for a follow-up interview aimed at determining the meanings and importance of the photographs for preparing children for school. [8]
Through the process of gaining protocol approval for this sub-study, I confronted a series of ethical concerns identified by a large university's institutional review board. Ethics committees are important features of the research process because they can permit or stifle research endeavors (EDWARDS, ASHCROFT & KIRCHIN, 2004). Review boards play an essential role in protecting the well-being and interests of humans involved in research. Research protocols are reviewed under seven criteria to ensure that researchers do not harm participants: 1. risk minimization, 2. risk/benefit comparison, 3. equitable subject selection, 4. informed consent, 5. data monitoring to ensure safety, 6. privacy protection and confidentiality, and 7. protection of vulnerable subjects. Despite the role and importance of ethics committees and the review process, little is actually known about how they each function (LIDZ et al., 2012). The review process is neither clear nor consistent, but it relies heavily on a conservative positivist perspective that is often inappropriate for, and incompatible with, qualitative social research (LINCOLN, 2008; PITT, 2014). There is also great variability in who and what is approved, calling into question issues of reliability and equity (SHAH, WHITTLE, WILFOND, GENSLER & WENDLER, 2004). Committees' knowledge and understanding of visual ethics is often limited, and their familiarity with number- and word-based protocols can place visual studies at a disadvantage in gaining approval (WILES, CLARK & PROSSER, 2011). Below, I share the concerns of one ethics committee that reviewed my protocol, my justifications for the inclusion of photography, and the eventual compromise involving the use of visual images. [9]
2.1 IRB critique: Do you have to use photographs?
After my first protocol review, the IRB deemed most of the project to be in accordance with the Code of Federal Regulations2), often referred to as the Common Rule, with the exception of the photography portion. The feedback suggested that I find a less invasive way to gather the same information, such as allowing participants to document activities in writing or through a traditional interview. Although such feedback was frustrating, the committee either voiced a legitimate concern or revealed a lack of visual understanding. Much as WILES et al. (2011) suggest, "[v]isual researchers will be invited to change important components of research design, in order to avoid breaking with number and word based conventions" (p.692). [10]
My protocol lacked a compelling argument for the inclusion of photography and PEIs, and the ethical ramifications connected to visual data increase the need to fully examine the reasons for choosing visual methods. This places a duty on researchers to be confident that the advantages of visual methods, compared with other methods, outweigh the additional ethical uncertainties (PAIN, 2012). There are a number of innovative ways to collect data, which can lead scholars to lose sight of why particular methods are chosen (BESSELL, 2009). Was I selecting the method for its avant-garde qualities, or was it essential to my research design (CLARK, 2013)? Based on the IRB feedback, I either needed to justify that the method was essential to the purpose of the larger study, or I needed to let it go. As WILES et al. (2012) state, it is about "making the case" to ethics committees and being as thorough as possible in defending one's methodological choices. [11]
2.2 Response: Yes, photographs are necessary
Like many other researchers, I failed to defend my choice of methods in my research plan, assuming my choices were clear. In fact, few studies using visual images actually defend or explicate the choice of method (PAIN, 2012). There are many compelling reasons to use visual images, but they may not be obvious to one's audience or critics and must be purposefully stated, since knowledge of and comfort with visual methods tend to be scant in the research community (WILES et al., 2011). In qualitative research, visual images provide a range of advantages, which include reaching participants and information that are otherwise difficult to access, sharing power with participants, facilitating communication, and drawing on different cognitive processes (BANKS, 2001). [12]
One of the most important justifications for this method is the research suggesting that exploring visual images and using language to explain ideas involve different cognitive processes. The fact that language processing uses some areas of the brain that are different from those used for visual information processing lends credence to the view that visual methods can open up a "different way of knowing and telling" (PROSSER & LOXLEY, 2007, p.63). Visuals can therefore be useful in many contexts where language-based communication may not access the situation fully (PAIN, 2011). Visual methods are not necessarily better, but they are a useful addition to language-based methods. They assist researchers in thinking differently about a topic or phenomenon: not necessarily more deeply or truthfully, but differently (HILL, 2013; SCHWARTZ, 1989). The claim that combining the two leads to richer and more holistic data seems plausible because verbal and visual processing use different areas of the brain (PAIN, 2011). Participants find visual representations easier to talk about, more comfortable and more interesting than a purely verbal dialogue (POWELL & SERRIERE, 2013). This was apparent in my prior work with families, who sought out visual images in their homes to complement their verbal responses. [13]
Further, although it is our ethical responsibility to protect participants involved in research, it is also an ethical responsibility for researchers to find ways to share power and the research process with participants. For years, sociology, feminism, and cultural studies have been concerned with giving voice to the people who are the subjects of research and with finding ways for individuals to interpret their own lives, rather than having those lives presented purely through the agenda of the researcher (YATES, 2010). Unfortunately, this happens less often in the field of education. Many researchers choose a visual method as a means of addressing the imbalance of power between researchers and participants (PAIN, 2012). By listening to individuals' own interpretations, researchers pass authority from themselves to participants. [14]
The research lexicon has shifted from the label "subjects" to "participants"; however, I do not believe this shift has altered the ways in which we engage with individuals involved in research studies (CORRIGAN & TUTTON, 2006). Participants are still treated as "subjects" in many studies. Taking an auto-driven approach with families was a way to shift power and allow participants to truly "participate" in the research process (CLARK, 1999). In my study, families were given more control over the data and acted as participant-researchers. This was achieved by asking participants to photograph their worlds with fewer restrictions than in most studies of children and families (HILL, 2013). Families were not only a data resource, but also a resource for the organization and analysis of the data (JENKINGS, WOODWARD & WINTER, 2008). [15]
Photography also creates a forum for the representation of gestures, expressions, emotions, dialogue and contexts in ways that written notes cannot. Participants are able to share the task of collecting and interpreting data, which helps to create a more equitable and informative research environment (BROOKS & WANGMO, 2011; PINK, 2007). PEIs do not necessarily take longer to complete than traditional semi-structured interviews (O'BRIEN, 2013), but they can provide very different data (MEO, 2010). There are currently few studies that directly compare non-visual methods with ones that incorporate visuals (ibid.); however, it was abundantly clear in my previous and current work that the photo elicitation process contributes unique data to studies. Further, this method holds the capacity to disseminate data in a distinct way. Photographs and images capture the reader's attention. Images draw readers into the text and provide a realness that is hard to convey with words. I liken it to watching a presentation with copious amounts of text versus one that is visually and textually balanced. Photographs are powerful tools that often provide a variety of benefits to the audience (MACINTOSH, 2006). [16]
These justifications were convincing enough to have the sub-study reconsidered. However, it was then met with a second wave of concerns related to the confidentiality and consent of participants. Although photography would be permitted as a method, the details and logistics of exactly how it would be used remained unsettled. Although I had clarified the need for photography, would its use violate participants' confidentiality? [17]
2.3 IRB critique: Protecting confidentiality and gaining consent
The principle of respect for autonomy and confidentiality within ethical guidelines presents an ongoing challenge to conducting visual research (CLARK, 2013; PAPADEMAS, 2009; WILES et al., 2008). With text-based data, harm is usually avoided by de-identifying the data, but with recognizable images of a person, confidentiality cannot be guaranteed. In fact, many of my participants were eager and willing to waive confidentiality, but ethics committees often disallow this, which can compromise the aim of the project and attempts to empower participants (PAIN, 2012). Further, the common strategy of placing a black strip over a participant's eyes or blurring a face is rarely effective and may even change the dynamic of the image (MACINTOSH, 2006). [18]
Another aspect of anonymity regarding participant-generated images is that other people may be featured in the photographs, with or without their knowledge. In many studies, participants are given rigid instructions on what they are and are not allowed to photograph (O'BRIEN, 2013). For example, in a study of the deaf community, participants who wished to photograph people were encouraged to take photographs that represented those people rather than images that directly identified them. This avoided the need to impose complex guidelines and rules about gaining consent from the people photographed. Protecting confidentiality was a prime concern identified by the IRB committee for my "Transition to School" project, because families often visit public spaces to support children's learning and development, such as museums, libraries and parks. Photographing children at sites where other children may appear in the picture creates ethical concerns for the researcher and should be weighed carefully against contextual norms (CLARK, PROSSER & WILES, 2010; PINK, 2007). For that reason, some researchers will not produce or disseminate photographs of individuals from whom they did not obtain individual consent (CLARK, 2012). [19]
2.4 Response: Minimal risk
Like other researchers, I justified photographing individuals on the grounds that the images were not "potentially damaging" and involved benign content (MEO, 2010). Other studies are routinely required by ethics committees to prohibit participants from photographing other people (e.g., RADLEY & TAYLOR, 2003); in fact, many studies disallow images capturing anyone's face. For my project, families were photographing school preparation activities, which meant people would be the foci of the pictures. Prohibiting the inclusion of people in the photographs would have compromised the data and my general goal for the study. [20]
Based on my research questions and the essential needs of the study, I argued that the project involved minimal risk for research participants and that the use of images would be negotiated with each participant (PAPADEMAS, 2009). PINK (2007) argues that it is difficult for a researcher to envision all of the potential and long-term risks for participants; however, as I brainstormed with colleagues, the consequences appeared limited. As stated in the Common Rule (46.102)3), minimal risk means that the probability and magnitude of harm or discomfort anticipated in the research are not greater in and of themselves than those ordinarily encountered in daily life. Any time an individual enters a public space, there is a general risk of appearing in the background of a picture. BANKS (2001) describes this as an intellectual resolution rather than a legal one, since it is impractical to have participants serve as ethical liaisons and collect consent from anyone who might appear in a photograph. This project did not place the child of interest, family members or complete strangers in any greater harm or discomfort than would ordinarily occur. Further, the project was auto-driven, and participants held complete control over what images were taken and eventually developed (CLARK, 1999). The process is much less invasive than researcher-produced photography, since I was not selecting who or what was photographed. [21]
2.5 Compromise: Photography with restrictions
After a series of negotiations, a compromise was finally reached. Based on IRB stipulations, photographs served only as prompts during the photo elicitation interviews and did not serve as data for analysis. I was not able to share the visual images beyond the research team, which limited their contributions to the study. Descriptions and interpretations of the photo elicitation process would have benefited from pairing images with the text to provide the audience with more complete data. However, the images could only be used during the interviews and then became obsolete in our analysis and dissemination of findings. Family members and members of the research team were the only individuals permitted to view the photos once they were developed. Actual photographs could not be included in any publications or documents, and the digital files were to be erased from my computer with a secure delete program once the photographs had been developed. [22]
The compromise was not ideal, but after months of struggling to gain approval, it still provided the opportunity to conduct my study. As I reviewed the consent form with each participant, many found the stipulations confusing and unnecessary. One mother was concerned that the images would be deleted, because she wanted more people to see them, and even suggested putting her son on a billboard. A father gave me a skeptical look and said: "Why?" I wanted to agree and say: "My thoughts exactly," but instead explained that the university requires its researchers to protect participants' confidentiality. Other studies document similar questions raised by participants and their desire to waive such ethical requirements (WILES et al., 2012). I was not pleased with the restrictions that framed the role of photographs in the study, but I decided to focus my energies on conducting the research rather than fighting the system. [23]
3. Approval to Disseminate Findings
Data collection and analysis took approximately one year, and then I entered the stage of writing and disseminating findings. After gaining approval to conduct the study, I did not anticipate further resistance. Much like the IRB approval process, peer review has been questioned for decades on its reliability, adequacy and fairness (PETERS & CECI, 1982). Nonetheless, the peer-review process is critical to ensuring the dissemination of quality research. The majority of scholars report satisfaction with the current review system; however, a little over half contend that it is an obstacle (ROWLAND, 2002). I was naïve in forgetting that publishing academic writing is always met with some resistance, especially writing that involves a method unfamiliar to most. Below, I share some of the key concerns raised by the peer reviewers who assessed the credibility of my study. [24]
3.1 Peer-review critique: Were the pictures staged?
Several reviewers raised the same question: How do you know the families did not stage all of these activities? It was such a superficial yet pragmatic concern that I was somewhat amused. Across all methods of inquiry, when individuals volunteer to participate in research and work with a researcher, there is the threat that doing so will alter or change their behavior (POPE & MAYS, 1995). Unquestionably, when participants take photographs, the process is inevitably influenced, to different degrees, by parents, friends and family, as well as by how they see the researcher and the researcher's purpose (SHARPLES, DAVISON, THOMAS & RUDMAN, 2003). They are also influenced by broader social norms about what is socially acceptable or desirable (HOLLIDAY, 2000). Essentially, each reviewer was raising the question: Are photographs a valid source of data for the families involved in the study? As with many studies in the social sciences, participants may be influenced by social desirability, providing responses or information that they believe will be viewed favorably by the audience or researcher (ROSE, 2011). [25]
3.2 The images do not stand alone
I acknowledged to the reviewers that a vulnerability of this method is that families may stage activities or document events that are not a regular occurrence in the family. I also considered reflexively how my role as the lead researcher influenced the participants in the study (KUPER, LINGARD & LEVINSON, 2008). The follow-up interview was a time to discuss the routines and rituals in the photographs and how frequently they occurred. Having multiple members of a family present and probing participants for information about the photographs allowed for checks and balances on the information provided (CLARK-IBANEZ, 2004; TWINE, 2006). The process of taking photographs was accompanied by other pieces of evidence, in the form of interviews, to clarify and contextualize the images (BANKS, 2001). It was less about the visual product and more about how families explained and interpreted the imagery (ROSE, 2011). However, when data are participant-driven, researchers rely on the information participants provide as part of the process. As with any research involving self-reporting, one accepts that there may be some social desirability in responses and makes efforts to triangulate the information provided. [26]
Although I fully believe visual images are an excellent source of information about the context and activities of participants' lives, I grappled with this issue both during the research process and while responding to the reviewers. Photographs are a source of data, but should they be data in themselves? HILL (2013) raises this question in her photo-based work with children and families: "Are photos just fodder for photo elicitation in interviews? Or can new, participant-produced photos be treated as standalone artifacts which deserve interpretation and analysis techniques on their own?" (p.146) [27]
Since images require interpretation by the audience and the researcher, there are dangers in their use, as one's culture and life experiences may shift an image's original intent. Using photographs as one strand of evidence within a larger project, as evidence for vulnerabilities as well as for the overall story, is often relevant and important to the broader purpose (HARPER, 2002). The photographs used in the study held a dual purpose. I used them as a tool to explore the contextualized lives of families, and simultaneously, participants employed them as a unique way to communicate activities in their everyday lives (CLARK-IBANEZ, 2004). However, the images were not sufficient on their own; they required verbal explanations to gain their full meaning. [28]
There is one memory from this project that really stands out for me in regard to photographs standing alone as data. In the middle of data collection, I introduced some of the photos to undergraduates working on the Transition to School project (MILLER, 2014). I purposefully began with two very different sets of photographs. In one set, the mother was a former preschool teacher, and the photographs reflected that training and lifestyle. There were many images of her daughter reading, practicing her letters by writing them in shaving cream on a table, and visiting the library. Another family focused more on documenting different family members, interpersonal interactions, and everyday activities involving cooking and errands. When I asked the group to start discussing the differences between the two sets of photographs, one student said: "So, basically, the first family is preparing their child for school and the second family really isn't doing anything." This was the general consensus of the group until I distributed the transcripts attached to each set of photographs, in which the families explained each image. For example, one mother described a photograph taken at the grocery store and explained how much reading and math can occur in each aisle. Another family discussed the social skills being tested and developed as two sisters played house with a doll. After reading through the transcripts, the same student said: "Okay, so now I feel like a jerk." [29]
Sadly, this is an example of the erroneous judgments that researchers and professionals regularly make about the lives of individuals. Without allowing participants to present their data and help interpret them, information may be lost, underestimated or misunderstood. This is an example of a student who still needed a great deal of training in qualitative inquiry and analysis. Unfortunately, similar attitudes are held by many individuals inside and outside of academia. It provides an argument for why images should not be viewed alone. It also helped me justify to the reviewer that the findings were not just a summary of photographs, but an analysis of visual images that were orally explained and interpreted by each family. [30]
Whether images are participant-produced or researcher-produced, verbal explanations need to accompany visual images, and these were essential to my study. Researchers should exercise caution not to romanticize or overstate the capacity of a photograph. Researchers cannot rely on being able to independently organize and interpret images as participants intended, nor can they allow photographs to speak for themselves (PHOENIX, 2010). Doing so risks misinterpreting or adulterating a participant's message, which is already a ubiquitous problem in much research on children and families. [31]
3.3 Peer reviewers: Quantifying images
A second reviewer requested a table with the distribution and frequency of each type of image. The reviewer wanted further evidence that the themes I reported in my findings were in fact the most frequently displayed in participant photographs. For some visually driven studies that request a set number of photographs from each participant, a table may help organize and present visual data. In that respect, some kinds of analysis of visual material can be numerically represented (BELL, 2001). For example, an author can disclose how many photographs were taken in total or how many people or objects were included in the images. Or a researcher may instruct a participant to take a specific number of pictures, and therefore to prioritize the types of moments or events they capture; what is represented in those five or six pictures may yield important quantitative information. The more important issue for most visual projects, however, is not simply what is depicted, but what its meaning and significance are. That is why it is important to see the interviews as linked to the meaning of the visual material (YATES, 2010). [32]
3.4 Response: Inadequate representation of findings
The reviewer's comments exposed possible epistemological differences at play in reviewing and interpreting the manuscript, which is common in the review process (MALLARD, LAMONT & GUETZKOW, 2009). A frequency table did not appropriately capture the information gained from families, nor did it represent the knowledge generated from the project. Since the photo elicitation process aimed to uncover the meaning and importance of the photographs, it prioritized the explanations that families offered for the images. Additionally, some families took a series of redundant pictures of a single activity, producing a high frequency count, while other families took only one picture of an activity they described as equally important. [33]
For photo elicitation studies, "more" does not necessarily mean "more important." A photograph, or series of photographs, may contribute little to the study at all. For example, one family had several photos of the youngest child in the family. In the literature on children's socialization for school, this could signify the importance of siblings for children's school readiness. When I asked the mother to talk about those pictures, her response was quite simple:
"Sorry, this really doesn't have anything to do with preparing him [older son] for kindergarten. I just love the color of his [younger son's] eyes and wanted to know if they came through in the pictures. And look at that, they do!" [34]
Although the images of the brother were printed to prompt the family during the interview, they were of limited value to the study. Similarly, in some cases, participants became what one participant described as "camera happy" during an event or activity and took a large number of photos. During the interview, families clustered the almost identical images together and talked about them as a collective group. Again, as demonstrated by the families, more photos did not mean an activity, event or person was of greater value; the families simply happened to take more pictures. [35]
Conversely, there were times when a single image was of the greatest importance to the family, even though multiple pictures were taken of other activities and objects. While reviewing the photographs, I asked one mother about a woman sitting with her son. She said: "That is my sister. Honestly, she is the one who raises him. I am working and in school right now, so she does everything." The mother proceeded to explain the influence of her sister and her impact on preparing the son for kindergarten. This insight was not visually represented across her photographs, but was explicated by the mother. It is difficult to communicate this to a research audience through numbers, unless the images are somehow weighted or ranked, which conflicts with the vision of the project. [36]
There were also important data that emerged in the interviews but were not represented in the photographs. For example, one mother discussed a cluster of photographs showing her son using a pen and paper. She said:
"So, here he is just messing around with paper and stuff. He does this a lot and I see some of the letters he is learning at his school [early childhood center]. Really, one of the biggest things that helps him for school is going to school in the mornings. I couldn't take pictures there because they wouldn't let me, but it is really good for him. He is with kids his age and they really know how to teach the basics. They know exactly what he needs." [37]
She proceeded to discuss the role of her son's early childhood center in supporting his development, as well as supporting her work as a mother. This is an example of how the PEI structure allows families to reflect on related, but indirect, associations with the images they have produced (CLARK-IBANEZ, 2004). Even if a moment, location or person is not in a photograph, participants can move beyond the printed images to supplement what was not captured on camera. Important data may not be present in a frequency count of the pictures, but they are part of the visual and verbal combination. [38]
Allowing multiple family members to participate in the interview sessions makes it possible for an image to have more than one meaning (BROOKS & WANGMO, 2011). A combination of family members invites a combination of perspectives and allows for multiple explanations of one image (SCHWARTZ, 1989). The varying perspectives complicate the data in a beneficial and rich manner (CLARK-IBANEZ, 2004). The tensions or differing interpretations expand a researcher's understanding of the phenomenon or topic of interest. How can you apply one label to an image when it can have an infinite number of meanings? This notion can be a scary enterprise for a scholar who is number-minded; however, this type of qualitative evidence needs to be communicated to the larger research community (DENZIN, 2009). It requires an epistemological negotiation with reviewers who are numbers-based in their approach to making sense of the world (MALLARD et al., 2009). I believe most reviewers are fully capable of repositioning themselves epistemologically to adequately and fairly evaluate research; they may just need some redirection and guidance to get there. [39]
4. Conclusion
Scholarly publications involving visual methods focus on the themes or ideas that emerge from the research. Rarely do scholars disclose the challenges encountered during the research process that are linked to their choice of method, and they often omit the process and stipulations imposed by ethics committees and peer reviewers. This article offers the account of one photo elicitation research study, in the hope of starting a wider conversation about how to respond to critics within and beyond the institution. Although each visual study is unique, with its own set of challenges and rewards, this article identifies some common obstacles that scholars can anticipate and offers ideas for how to defend visual work, as the conservative climate of research approval can inhibit visual methods (PITT, 2014; WILES et al., 2012). [40]
The way in which ethical regulation is inappropriately applied to visual research is discouraging this line of research and limiting the manner in which photography can lead or supplement a study (WILES et al., 2012). As enthusiasm for visual methods continues to mount, advocacy needs to increase. It is not enough to simply "manage" the process of gaining approval or to comply with regulations; researchers must also promote equity for visually based work. WILES et al. (2011) write:
"The future development of ethical visual research is best served by proactive practitioners able to inform, educate, debate, and generally contribute to the effective functioning of ethical committees. Visual researchers cannot afford to sit on the sidelines when ethics are debated, but should think through and argue their ethical position" (p.692). [41]
On a positive note, the federal government is considering changes to the Common Rule for two reasons: 1. the human subject research landscape has changed dramatically since the early 1980s, when the current regulations were first formulated, and 2. in light of that, there is a need to address the effectiveness and efficiency of the regulations for human subject protections in the current research environment. There is common agreement that the Common Rule needs to be modernized to more appropriately reflect the changing needs and nature of research studies (EVANS, 2013). The difficulty lies in allowing for progressive and beneficial studies while still preserving the protection of participants. Much of the focus is on modifications for the biomedical field rather than the social sciences. It remains unclear whether any changes will simplify or challenge the work of visual researchers. Nonetheless, researchers must find ways to minimize the negative impact of the review process on research design. [42]
Although ethics committees set the boundaries for the inclusion and dissemination of visual images in research, there is a range of ethical considerations beyond those determined at the institutional level (PAPADEMAS, 2009; PITT, 2014; WILES et al., 2011). The ethical questions researchers should consider may in fact not be the legal guidelines that IRB committee members focus on in reviewing protocols. Once researchers gain approval, they may dive into their research and neglect to revisit the ever-emerging consequential issues that impact participants and the groups they represent. This social and professional responsibility is also relevant to the peer review process, when reviewers call for manuscript changes that may impact how participants are presented and viewed by an audience. [43]
Giving cameras to individuals is one way of soliciting their stories and perspectives and of providing visual evidence that might transfer those stories into other contexts (CLARK-IBANEZ, 2004). However, how that "voice" is produced, whose voice it represents, and how the product of the research is used and interpreted are all critical issues for researchers using participatory visual methods (YATES, 2010). These are the real questions that are often not identified by ethics committees and reviewers, but they are essential to studies involving visual methods in a "wider framework of situated ethics" (CLARK, 2013, p.78). Essentially, after defending your work to IRB and peer-review critics, you must remain a critic of your own work throughout the full visual methods process. [44]
1) A high school English club is an extra-curricular group that serves as a social network for students who share an interest in the subject of English. <back>
2) This federal policy was created to protect human subjects and was published in 1991 by the United States government. It outlines the basic ethical requirements for any research involving human subjects across all departments. <back>
3) The federal regulation defines minimal risk as follows in Section 46.102: Minimal risk means that the probability and magnitude of harm or discomfort anticipated in the research are not greater in and of themselves than those ordinarily encountered in daily life or during the performance of routine physical or psychological examinations or tests (see http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html [Accessed: February 23, 2015]). <back>
Banks, Marcus (2001). Visual methods in social research. London: Sage.
Bell, Philip (2001). Content analysis of visual images. In Theo Van Leeuwen & Carey Jewitt (Eds.), Handbook of visual analysis (pp.10-34). Thousand Oaks, CA: Sage.
Bessell, Sharon (2009). Research with children: Thinking about method and methodology. Australian Research Alliance for Young Children, http://www.aracy.org.au/publications-resources/command/download_file/id/108/filename/Involving_children_and_young_people_in_research.pdf [Accessed: April 14, 2015].
Brooks, Margaret & Wangmo, Tshering (2011). Introducing the project approach and use of visual representation to early childhood education in Bhutan. Early Childhood Research & Practice, 13(1), 1-35.
Clark, Andrew (2012). Visual ethics in a contemporary landscape. In Sarah Pink (Ed.), Advances in visual methodology (pp.17-35). London: Sage.
Clark, Andrew (2013). Haunted by images? Ethical moments and anxieties in visual research. Methodological Innovations Online, 8(2), 68-81.
Clark, Andrew; Prosser, Jon & Wiles, Rose (2010). Ethical issues in image-based research. Arts & Health, 2(1), 81-93.
Clark, Cindy Dell (1999). The autodriven interview: A photographic viewfinder into children's experiences. Visual Sociology, 14, 39-50.
Clark-Ibanez, Marisol (2004). Framing the social world with photo-elicitation interviews. American Behavioral Scientist, 47, 1507-1527.
Collier, John (1967). Visual anthropology: Photography as a research method. New York: Holt, Rinehart and Winston.
Corrigan, Oonagh & Tutton, Richard (2006). What's in a name? Subjects, volunteers, participants and activists in clinical research. Clinical Ethics, 1(2), 101-104.
Denzin, Norman K. (2009). The elephant in the living room: Or extending the conversation about politics of evidence. Qualitative Research, 9(2), 139-160.
Edwards, Sarah; Ashcroft, Richard & Kirchin, Simon (2004). Research ethics committees: Differences and moral judgment. Bioethics, 18(5), 408-427.
Evans, Barbara (2013). Why the Common Rule is hard to amend. University of Houston Law Center, No. 2012-W-4, http://dx.doi.org/10.2139/ssrn.2183701 [Accessed: March 26, 2015].
Harper, Douglas (2002). Talking about pictures: A case for photo elicitation. Visual Studies, 17(1), 13-26.
Hill, Joanne (2013). Using participatory and visual methods to address power and identity in research with young people. Graduate Journal of Social Science, 10(2), 132-151.
Holliday, Ruth (2000). We've been framed: Visualising methodology. Sociological Review, 48(4), 503-521.
Hurworth, Rosalind; Clark, Eileen; Martin, Jenepher & Thomsen, Steve (2003). The use of photo-interviewing: Three examples from health evaluation and research. Evaluation Journal of Australasia, 4(1/2), 52-62.
Jenkings, Neil; Woodward, Rachel & Winter, Trish (2008). The emergent production of analysis in photo elicitation: Pictures of military identity. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 9(3), Art. 30, http://nbn-resolving.de/urn:nbn:de:0114-fqs0803309 [Accessed: April 14, 2015].
Kuper, Ayelet; Lingard, Lorelei & Levinson, Wendy (2008). Critically appraising qualitative research. British Medical Journal, 337, 687-692.
Lidz, Charles; Appelbaum, Paul Stuart; Arnold, Robert; Candillis, Philip; Gardner, William; Myers, Suzanne & Simon, Lorna (2012). How closely do institutional review boards follow the Common Rule? Academic Medicine, 87(7), 969-974.
Lincoln, Yvonna S. (2008). Institutional review boards and methodological conservatism: The challenge to and from phenomenological paradigms. In Norman K. Denzin & Yvonna S. Lincoln (Eds.), The landscape of qualitative research (3rd ed., pp.221-243). Thousand Oaks, CA: Sage.
Macintosh, Tracy (2006). Ethical considerations for clinical photography in the global south. Developing World Bioethics, 6(2), 81-88.
Mallard, Gregoire; Lamont, Michele & Guetzkow, Joshua (2009). Fairness as appropriateness: Negotiating epistemological differences in peer review. Science, Technology & Human Values, 34(5), 573-606.
Meo, Analia Ines (2010). Picturing students' habitus: The advantages and limitations of photo elicitation interviewing in a qualitative study in the city of Buenos Aires. International Journal of Qualitative Methods, 9(2), 149-171, https://ejournals.library.ualberta.ca/index.php/IJQM/article/view/6682/7026 [Accessed: March 26, 2015].
Miller, Kyle (2014). Learning about children's school preparation through photographs: The use of photo-elicitation interviews with low-income families. Journal of Early Childhood Research, 13(4), OnlineFirst, http://ecr.sagepub.com/content/early/2014/12/16/1476718X14555703.abstract [Accessed: April 22, 2015].
O'Brien, Dai (2013). Visual research with young deaf people—An investigation of the transitional experiences of deaf young people from mainstream schools using auto-driven photo-elicitation interviews. Graduate Journal of Social Science, 10(2), 152-175.
Pain, Helen (2011). Visual methods in practice and research: A review of empirical support. International Journal of Therapy and Rehabilitation, 18(6), 343-350.
Pain, Helen (2012). A literature review to evaluate the choice and use of visual methods. International Journal of Qualitative Methods, 11(4), 303-319. https://ejournals.library.ualberta.ca/index.php/IJQM/article/view/10397/14340 [Accessed: March 18, 2015].
Papademas, Diana (2009). IVSA code of ethics and guidelines. Visual Studies, 24(3), 250-257.
Peters, Douglas & Ceci, Stephen (1982). Peer-review practices of psychological journals: The fate of published articles, submitted again. Behavioral and Brain Sciences, 5(02), 187-195.
Phoenix, Cassandra (2010). Seeing the world of physical culture: The potential of visual methods for qualitative research in sport and exercise. Qualitative Research in Sport and Exercise, 2(2), 93-108.
Pink, Sarah (2007). Visual interventions: Applied visual anthropology. Oxford: Berghahn Books.
Pitt, Penelope (2014). "The project cannot be approved in its current form": Feminist visual research meets the human research ethics committee. Australian Educational Researcher, 41, 311-325.
Pope, Catherine & Mays, Nick (1995). Qualitative research: Reaching the parts other methods cannot reach: An introduction to qualitative methods in health and health services research. British Medical Journal, 311(6996), 42-45.
Powell, Kimberley & Serriere, Stephanie (2013). Image-based participatory pedagogies: Reimagining social justice. International Journal of Education and the Arts, 14(15), 1-27.
Prosser, Jon & Loxley, Andrew (2007). Enhancing the contribution of visual methods to inclusive education. Journal of Research in Special Educational Needs, 7(1), 55-68.
Radley, Alan & Taylor, Diane (2003). Images of recovery: A photo-elicitation study of a hospital ward. Qualitative Health Research, 13(1), 77-99.
Redshaw, Margaret E.; Harris, Adrian L. & Baum, J.D. (1996). Research ethics committee audit: Differences between committees. Journal of Medical Ethics, 22(2), 78-82.
Rose, Gillian (2011). Visual methodologies: An introduction to researching with visual materials. London: Sage.
Rowe, Jeremy (2011). Visual research ethics at the crossroads. In Eric Margolis & Luc Pauwels (Eds.), The Sage handbook of visual research methods (pp.707-722). London: Sage.
Rowland, Fytton (2002). The peer-review process. Learned Publishing, 15(4), 247-258.
Schwartz, Dona (1989). Visual ethnography: Using photography in qualitative research. Qualitative Sociology, 12(2), 119-153.
Shah, Seema; Whittle, Amy; Wilfond, Benjamin; Gensler, Gary & Wendler, David (2004). How do institutional review boards apply the federal risk and benefit standards for pediatric research? JAMA, 291(4), 476-482.
Sharples, Mike; Davison, Laura; Thomas, Glyn & Rudman, Paul (2003). Children as photographers: An analysis of children's photographic behaviour and intentions at three age levels. Visual Communication, 2(3), 303-330.
Twine, France (2006). Visual ethnography and racial theory: Family photographs as archives of interracial intimacies. Ethnic and Racial Studies, 29(3), 487-511.
Wiles, Rose; Clark, Andrew & Prosser, Jon (2011). Visual research ethics at the crossroads. In Eric Margolis & Luc Pauwels (Eds.), The Sage handbook of visual research methods (pp.685-706). London: Sage.
Wiles, Rose; Coffey, Amanda; Robinson, Judy & Heath, Sue (2012). Anonymisation and visual images: Issues of respect, "voice" and protection. International Journal of Social Research Methodology, 15(1), 41-53.
Wiles, Rose; Prosser, Jon; Bagnoli, Anna; Clark, Andrew; Davies, Katherine; Holland, Sally & Renold, Emma (2008). ESRC National Centre for Research Methods review paper: Visual ethics: Ethical issues in visual research. National Centre for Research Methods, NCRM/011, http://eprints.ncrm.ac.uk/421/1/MethodsReviewPaperNCRM-011.pdf [Accessed: April 21, 2015].
Yates, Lyn (2010). The story they want to tell, and the visual story as evidence: Young people, research authority and research purposes in the education and health domains. Visual Studies, 25(3), 280-291.
Dr. Kyle MILLER is an assistant professor of child development in the College of Education at Illinois State University in the United States. Her research focuses on qualitative research with families from lower-income backgrounds and the use of visual images as a research tool. Dr. MILLER is a former urban teacher and now serves as an instructor for courses in child development, issues of diversity in education, foundations of elementary education and graduate courses in research methods.
Contact:
Kyle Miller, PhD
School of Teaching and Learning
Illinois State University
237 DeGarmo, Box 5330
Normal, IL 61711, USA
Tel.: +1 309-438-1527
E-mail: kemille@ilstu.edu
URL: http://education.illinoisstate.edu/faculty_staff/biography.php?ulid=kemille
Miller, Kyle (2015). Dear Critics: Addressing Concerns and Justifying the Benefits of Photography as a Research Method [44 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 16(3), Art. 27, http://nbn-resolving.de/urn:nbn:de:0114-fqs1503274.