2005-09-30: Evaluating Visualizations: Do Expert Reviews Work?

From InfoVis:Wiki

The article Evaluating Visualizations: Do Expert Reviews Work? by M. Tory and T. Moller appeared in IEEE Computer Graphics and Applications:

Abstract

Visualization research generates beautiful images and impressive interactive systems. Such developments make fascinating demos, but how do we know if they are actually useful for real people doing real tasks? If the interaction is awkward or we have not carefully considered users' needs, even the most well-intentioned and technically developed visual display will be ineffective. Emphasis on evaluating visualizations is growing. User studies of perceptual phenomena related to visualization and comparisons of visualization tools are becoming hot topics in the visualization literature. But, along the way, researchers are discovering that user study design is rarely straightforward.
[Tory and Moller, 2005]
References

[Tory and Moller, 2005] M. Tory and T. Moller, "Evaluating Visualizations: Do Expert Reviews Work?", IEEE Computer Graphics and Applications, 25(5):8-11, Sept./Oct. 2005.