In this position paper we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community then these may become one of the most effective future strategies for both formative and summative evaluations.
| Original language | English (US) |
| Title of host publication | Proceedings of the 2012 Workshop on Beyond Time and Errors - Novel Evaluation Methods for Visualization, BELIV 2012 |
| State | Published - Dec 1 2012 |
| Event | 2012 4th Workshop on Beyond Time and Errors - Novel Evaluation Methods for Visualization, BELIV 2012 - Seattle, WA, United States |
| Duration | Oct 14 2012 → Oct 15 2012 |