Evaluation
Revision as of 11:26, 15 March 2006
Evaluating InfoVis: Been There, Done That?
The usefulness of an InfoVis tool is harder to predict than that of 'classic' software, because human reasoning processes strongly influence how successfully it can be applied. Even after participatory design and faithful development, the outcome therefore has to be evaluated thoroughly.
Usability is not merely important but can become vital, given the interactive and exploratory nature of many of the tasks users perform. On the one hand, usability questions therefore deserve particular attention in an iterative design process. On the other hand, a rigorous examination is also essential for assessing the InfoVis technique itself, because technique and usability are interdependent.
Ecological validity and external validity: looking for the gemstones.
As necessary as it is to conduct 'classic' controlled experiments, it is equally essential to break out of the laboratory and carry out some form of field observation. In many cases this is the only way to evaluate usefulness in the 'real world' and to ensure ecological validity. The same applies to the question of generalization: sometimes only other user populations can judge for themselves whether a technique makes sense in their setting, with their data and tasks, thereby allowing a sound assessment of external validity.
Related Pages
2005-09-30: Evaluating Visualizations: Do Expert Reviews Work?
Useful Resources
- Catherine Plaisant: "The challenge of information visualization evaluation" (http://hcil.cs.umd.edu/trs/2004-19/2004-19.pdf). AVI 2004: 109-116
- A list of papers dealing with evaluation for InfoVis: http://www.dis.uniroma1.it/~beliv06/infovis-eval.html
- Anita Komlodi et al.: "Information Visualization Evaluation Review Bibliography" (http://www.research.umbc.edu/~komlodi/IV_evaluation_bibliography_Sep2005.pdf), with over 200 entries.