Appealing to Justice

Kitty Calavita and Valerie Jenness

Print publication date: 2014

Print ISBN-13: 9780520284173

Published to California Scholarship Online: May 2015

DOI: 10.1525/california/9780520284173.001.0001



Appendix A Procedures for Interviews with Prisoners


University of California Press

Before any interviewing began, we engaged in extensive training with our research team to address procedures for obtaining informed consent, maintaining confidentiality, and responding to the kind of special circumstances that researchers encounter when conducting research in prison settings, as well as standard interviewing techniques and probing strategies. Above all, we aimed to reduce interviewer bias and ensure that all subjects were approached in a similar fashion to participate in the study, were asked standard questions in similar ways, and—as much as possible in this most coercive of settings—did not feel required to participate.

Prisoner interviewing began in July 2009 and ended in late September 2009. During this time, members of the interview team traveled to each of the three prisons and completed 120 face-to-face interviews with prisoners, including some who were confined to administrative segregation (“Ad Seg,” or what prisoners call “the hole”). The two coauthors did the majority (60 percent) of these prisoner interviews. Once these interviews were complete, the coauthors returned to these same prisons and interviewed CDCR staff who participate in processing inmate grievances—including all wardens, deputy wardens, and appeals coordinators, as well as two randomly chosen captains in each prison. Finally, we traveled to Sacramento to interview key administrative officials and randomly selected examiners in the Inmate Appeals Branch, which conducts the final review of grievances.

For the prisoner interviews, we made every effort to secure a random sample and ensure that officials did not interfere with sample selection either on purpose or inadvertently. About ten days prior to the first day of interviewing at a particular prison, we requested from the CDCR Office of Research the official roster that identified every person housed in the prison by name and CDCR number. From this roster, we used Statistical Package for the Social Sciences (SPSS) to randomly select fifty inmates to serve as potential study participants, with the goal of interviewing the first forty (the other ten would serve as backup candidates for interviews in case of denials or when a prisoner in our original sample had been transferred or released by the time we arrived to conduct interviews). We then sent the list to our liaison at the prison, typically the public information officer or another lieutenant, so that the first forty prisoners could be notified by a ducat that they were invited to meet with an interviewer. This process resulted in a 93 percent participation rate, an unusually high rate of participation in any research setting, and one that may have been enhanced by our "outsider" status, as discussed in the introduction.
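The authors performed this selection in SPSS; a minimal sketch of the same procedure, written here in Python with a hypothetical roster for illustration, draws forty primary interviewees plus ten backups from the full facility roster:

```python
import random

def draw_interview_sample(roster, n_primary=40, n_backup=10, seed=None):
    """Randomly draw interview candidates from a full prison roster.

    Returns (primary, backup): the first n_primary entries are the
    intended interviewees; the n_backup entries replace anyone who
    has been paroled, transferred, or declines to participate.
    """
    rng = random.Random(seed)
    chosen = rng.sample(list(roster), n_primary + n_backup)
    return chosen[:n_primary], chosen[n_primary:]

# Hypothetical roster: (name, CDCR number) pairs for everyone
# housed at one prison on the date of the records request.
roster = [(f"Inmate {i:04d}", 100000 + i) for i in range(3000)]
primary, backup = draw_interview_sample(roster, seed=2009)
```

Drawing both groups in a single random sample, rather than two separate draws, guarantees that backups are selected by the same chance mechanism as the primary forty and never overlap with them.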

Two practices were key to protecting the integrity of random selection, guaranteeing prisoner privacy and confidentiality, and ensuring that participation was as voluntary as possible in a prison setting. First, ducats stated the appointment was for "research on prison life" or "interview," and officers were asked not to discuss the research with inmates. Although we briefed upper-level prison administrators on why the research team was at the facility and would be there for multiple days, we made an effort to keep rank-and-file officers, especially those who escorted prisoners to and from interviews, unaware of the purpose of the study. We did so to minimize the degree to which they could intentionally or unintentionally contaminate the field or otherwise undermine the research. Despite our best efforts, on occasion it was clear that rank-and-file officers had a sense, in some cases an erroneous one, of why we were there; we did not confirm or deny their assumptions. Second, if an inmate could not be scheduled for the interview—for example, because he had been paroled, transferred, or was in the hospital—the CDCR staff was instructed to note the reason and then proceed to the next randomly selected person on the list until all interview appointments were filled. A few people who were scheduled for interviews did not show up for the meeting, in which case they were given the opportunity to be interviewed on a subsequent day. They occasionally had another appointment at the same time (e.g., a medical appointment or a work assignment), and sometimes we were told that an inmate chose not to leave his cell.

These one-on-one interviews were conducted in strictly confidential settings where only the interviewer and interviewee were present and could not be overheard. The location varied depending on available space. For example, we interviewed in correctional counselors' offices, staff lunch rooms, chapels, and visiting rooms, as well as conference rooms and what appeared to be custodial closets. Once inmate respondents had given their informed consent to be interviewed,1 we asked their permission to record the interview—with the understanding that the recorder could be turned off at any time—and 91 percent of them agreed to be recorded.

The interviewer asked prisoners both closed- and open-ended questions about housing arrangements, daily prison life, problematic or bothersome conditions, perceptions of the inmate appeals process, involvement with the appeals process, assessments of the legitimacy of the appeals process, perceived fairness of the criminal justice system, and recommendations for improving the grievance system. After a few initial rapport-building questions, we asked about any problems in prison and how they dealt with those problems. Later in the interview, we asked if they had filed any grievances and, if they had, what they were about. At this point, we inquired specifically about and completed an "incident form" that we developed to capture data on specific instances of grievance filing. The incident form we created for the purpose of this study focused on (1) a grievance the respondent had filed that was resolved informally, (2) the most recent grievance he had filed, (3) the most important grievance he had filed, and (4) a grievance he had filed, if any, that had been granted. In addition, at the end of the interview, we read a series of statements (e.g., "The inmate grievance process works pretty well," "Staff retaliate against inmates who file"), to which we asked respondents to "strongly agree, agree, disagree, strongly disagree, or neither agree nor disagree." The interview instrument was semistructured in format and included follow-up questions that led to free-flowing exchanges. This allowed the prisoners, who are often restricted from speaking with outsiders, to share their personal stories and have their voices heard. The average interview length was slightly over one hour, with the shortest interview lasting just under half an hour, and some interviews extending well over two hours.

As a final step, we concatenated official data from the CDCR’s database on inmates (OBIS) to the interview data. Because privacy concerns required that the identities of participants be kept confidential, we requested and received central file information on eighty prisoners from each of the prisons (our sample of forty per prison, plus forty “decoys”).
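The decoy design described above can be sketched in code. The following illustration (hypothetical field names and data, not the CDCR's actual OBIS schema) pads each prison's real sample of forty with forty randomly chosen decoy IDs before the records request, then merges the returned records onto the interview data by CDCR number, so decoy records simply go unused:

```python
import random

def records_request(sample_ids, roster_ids, n_decoys=40, seed=None):
    """Pad the real interview sample with randomly chosen decoy IDs so
    the central-file request does not reveal who was interviewed."""
    rng = random.Random(seed)
    sample_set = set(sample_ids)
    pool = [i for i in roster_ids if i not in sample_set]
    decoys = rng.sample(pool, n_decoys)
    request = list(sample_ids) + decoys
    rng.shuffle(request)  # ordering should give nothing away either
    return request

def merge_official_records(interviews, records):
    """Attach official records to interview data by CDCR number;
    records for decoys never match an interview and are discarded."""
    by_id = {r["cdcr"]: r for r in records}
    return [{**iv, **by_id.get(iv["cdcr"], {})} for iv in interviews]

# Hypothetical data for one prison.
roster_ids = list(range(100000, 103000))
sample_ids = roster_ids[:40]                       # the real sample
request = records_request(sample_ids, roster_ids, seed=1)

interviews = [{"cdcr": i, "filed_grievance": True} for i in sample_ids]
records = [{"cdcr": i, "custody_level": "II"} for i in request]
merged = merge_official_records(interviews, records)
```

Because the agency fulfilling the request sees only the padded, shuffled list of eighty IDs per prison, it cannot infer which forty people actually sat for interviews.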

The coding process began with the authors taking notes during the data collection period to track emergent ideas and themes. Once prisoner interviewing was complete, we read through a subset of transcribed interviews to develop preliminary coding categories designed to capture, among other things, (1) what prisoners identified as problems; (2) how they made sense of those problems, including attributions of blame for problems associated with prison life; (3) what grievances they had filed; and (4) whether they felt fairly treated by the criminal justice system. A team of four advanced graduate students then did “focused coding” (Charmaz 2006) of all the transcribed interviews—coding that was subjected to multiple checks by members of the research team. These coded data, coupled with the quantitative data from our interviews, concatenated official data, and research notes, form the basis for our analyses and arguments relating to the extent and nature of prisoner grievance filing, its meaning to the prisoner appellants, how and why it differs from disputing patterns in other stigmatized and vulnerable populations, and how to make sense of these differences.


(1.) People were not interviewed unless they gave their informed consent, as required by our Institutional Review Board. Moreover, they were not required to answer any question they chose not to answer and were allowed to discontinue the interview at any time. Recognizing that our interviewees lived in carceral environments, the research design and attendant logistics were organized to ensure that we did not in any way signal to people that they were required to participate, nor did we promise anything in exchange for participating.