In this article, we propose a new metaphorical framework for understanding and using mixed methods evaluation. Our re-conceptualization of mixing methods provides important insights into the theoretical and practical challenges of a mixed methods approach, which can create philosophical and practical dilemmas in how data are collected, analyzed, interpreted, and reported. We use an archipelago, a set of islands that loosely forms a group, as a metaphor for resolving these challenges. The archipelago metaphor helped to clarify and re-conceptualize the evaluation approach and its findings by allowing simultaneous consideration of different mixed methods and stances. The results have implications for those attempting to use mixed methods to evaluate programs.
Teacher enhancement and curriculum development projects have been funded by the National Science Foundation for many years, and there is strong evidence that science teachers need this continuing, supplementary education (Weiss, 1997). Although most people believe that teacher enhancement programs are beneficial, there have been increasing calls for more careful accountability of their outcomes (NSF, 1993). Very few evaluations have documented the actual effect of enhancement on teachers' classroom behavior or student outcomes (NSF, 1996). Even when outcome data are gathered, usually only one method is employed, in spite of the advantages of using a variety of data collection methods (Patton, 1990; Shadish, 1993). We had the opportunity to conduct a long-term evaluation of a science education reform that employed mixed methods. We believe that the approach, and the description we provide here of that evaluation, raise valuable issues regarding mixed methods evaluation.
Our evaluation of the project was designed to fit into a politically charged environment. When we were contracted to conduct the comprehensive evaluation of SS&C, the NRC Science Standards were just being published. The NSTA and the American Association for the Advancement of Science (AAAS) had both been involved in developing guidance for science education reform before the NRC received funding from the National Science Foundation to develop the Standards as a neutral party. It was crucial that our evaluation provide information persuasive and rigorous enough to convince parties on both sides of the fence. The NSF and AAAS are both traditional scientific communities with long histories of using more "logical-positivistic" research approaches. On the other hand, we knew we needed rich descriptive information to fully capture the impact of a program of this magnitude. Consequently, mixed methods were an obvious choice. The question was, exactly how should we mix the methods?