1. Be Clear About the Questions
The evaluation questions drive the study. If they are ambiguous or not suited to the users’ needs, even a well-implemented method will produce findings of doubtful value. To be clear about the questions means to state them as specifically as possible so that the answers will be useful to decisionmakers. One exception to this rule—probably the only exception—is when the main purpose of the study is for evaluators to learn systematically about a substantive area in preparation for doing a main study. When this is the goal, the findings may not be directly useful to decisionmakers, but they should be a stepping stone to subsequent studies designed to serve policy needs.
2. Consider the Broad Options
Content analysis is only one approach to drawing conclusions from textual data. Other options that allow for the retrieval and manipulation of actual segments of text are briefly discussed in appendix I. The textual methods referred to there may be better suited to answering some evaluation questions than content analysis.
3. Define the Variables Carefully
The need for careful definitions of the variables, including the specification of their categories, cannot be overstated. Pitfalls abound: defining variables that cannot be used to answer the evaluation questions, defining variables that are so ambiguous as to defy reasonable categorization and interpretation, specifying categories that are not mutually exclusive and exhaustive, and specifying categories ambiguously so that coders can work only capriciously. Faulty definition is one of the main contributors to unreliability in the coding process.
Defining the variables should begin early because the definition may require a restatement of the evaluation questions. The possibility of redefinition should extend into the implementation phase, because training coders constitutes a test of the categories and may reveal problems in making the connection between the variables’ definitions and the assignment of codes.
4. Define Recording Units Carefully
The selection of recording units is based upon the nature of the variables and the textual material to be coded. For a given variable, different recording units can produce different findings. Therefore, considerable thought must go into the decision on recording units. Later, the coders must understand the recording units and apply them consistently so that the reliability of the coding process is maintained. When the recording units have obvious physical boundaries, as whole text, paragraphs, and words do, the coder’s task is relatively easy. When the theme is a recording unit, as it often is in an evaluation, extra precautions must be taken to avoid unreliability.
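The point that different recording units can produce different findings can be made concrete with a small sketch. The segmentation rules below (blank lines for paragraphs, a simple token pattern for words) are illustrative assumptions, not rules prescribed by the source.

```python
# Minimal sketch (illustrative only): segmenting the same text under two
# recording units -- paragraphs and words -- to show that the choice of
# unit changes what gets counted.

import re

def paragraphs(text: str) -> list[str]:
    """Split on blank lines; each paragraph is one recording unit."""
    return [p.strip() for p in re.split(r"\n\s*\n", text) if p.strip()]

def words(text: str) -> list[str]:
    """Each word token is one recording unit."""
    return re.findall(r"[A-Za-z']+", text.lower())

doc = "Funding rose sharply.\n\nStaff turnover, however, remained high."
print(len(paragraphs(doc)), len(words(doc)))  # 2 paragraph units, 8 word units
```

Units with physical boundaries, like these, can be produced automatically; thematic units cannot, which is why they demand the extra precautions noted above.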
5. Develop an Analysis Plan
The steps in content analysis are deceptively simple and may therefore tempt the evaluator to postpone serious thought about data analysis until coding has been completed. This would be a mistake. In designing and implementing a content analysis, evaluators will come to several decisions that bear on whether the analysis will be possible. These decisions—most notably, defining the variables, defining the recording units, and choosing the software—should not be made until after a preliminary data analysis plan has been developed. Otherwise, the evaluator may arrive at the time for data analysis and find some important options foreclosed.
6. Plan for Sufficient Staff and Time
Content analysis can be time-consuming. A coding manual must be prepared and, probably, revised several times. Coders must be trained and given time to practice coding until their reliability is satisfactory. These two steps alone can easily take a couple of months. The time required for the final coding process depends upon the amount of material to be coded, the number of variables, the number of coders, and the judgment required for coding decisions. Careful definition of variables will help keep the need for judgment to a minimum but, in most analyses, some variables will be complex and subtle and coding decisions will take time.
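One common way to decide whether practice coding has reached satisfactory reliability is to compare two coders’ assignments with a chance-corrected agreement statistic such as Cohen’s kappa. The sketch below is a hypothetical illustration; the codes and coders are invented, and the source does not prescribe a particular statistic or threshold.

```python
# Minimal sketch (hypothetical data): Cohen's kappa for two coders, a
# chance-corrected measure of intercoder agreement on the same material.

from collections import Counter

def cohens_kappa(codes_a: list[str], codes_b: list[str]) -> float:
    """Agreement between two coders, corrected for chance agreement."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    # Observed agreement: share of units both coders coded identically.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected agreement: chance overlap given each coder's code frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Two coders' codes for the same six recording units (invented example).
a = ["pos", "pos", "neg", "neu", "neg", "pos"]
b = ["pos", "neg", "neg", "neu", "neg", "pos"]
print(round(cohens_kappa(a, b), 2))  # -> 0.74
```

Raw percent agreement alone overstates reliability when a few categories dominate, which is why a chance-corrected statistic is usually preferred for judging when coders are ready for the final coding process.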
Source: GAO (2013), Content Analysis: A Methodology for Structuring and Analyzing Written Material (PEMD-10.3.1), BiblioGov.