Visual Analytics

From InfoVis:Wiki
Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces. [Thomas and Cook, 2005]
People use visual analytics tools and techniques to synthesize information and derive insight from massive, dynamic, ambiguous, and often conflicting data; detect the expected and discover the unexpected; provide timely, defensible, and understandable assessments; and communicate assessment effectively for action.

Visual analytics is a multidisciplinary field that includes the following focus areas:

  • analytical reasoning techniques that let users obtain deep insights that directly support assessment, planning, and decision making;
  • visual representations and interaction techniques that exploit the human eye’s broad bandwidth pathway into the mind to let users see, explore, and understand large amounts of information simultaneously;
  • data representations and transformations that convert all types of conflicting and dynamic data in ways that support visualization and analysis; and
  • techniques to support production, presentation, and dissemination of analytical results to communicate information in the appropriate context to a variety of audiences.
[Thomas and Cook, 2005, 2006]

Visual analytics is a highly interdisciplinary field of research.
[Keim et al., 2006]
Visual analytics is more than just visualization and can rather be seen as an integrated approach combining visualization, human factors and data analysis. ... With respect to the field of visualization, visual analytics integrates methodology from information analytics, geospatial analytics, and scientific analytics. Especially human factors (e.g., interaction, cognition, perception, collaboration, presentation, and dissemination) play a key role in the communication between human and computer, as well as in the decision-making process.
[Keim et al., 2006]

Visual analytics is the formation of abstract visual metaphors in combination with a human information discourse (usually some form of interaction) that enables detection of the expected and discovery of the unexpected within massive, dynamically changing information spaces. It is an outgrowth of the fields of scientific and information visualization but includes technologies from many other fields, including knowledge management, statistical analysis, cognitive science, decision science, and others.

This marriage of computation, visual representation, and interactive thinking supports intensive analysis. The goal is not only to permit users to detect expected events, such as might be predicted by models, but also to help users discover the unexpected—the surprising anomalies, changes, patterns, and relationships that are then examined and assessed to develop new insight.
[Cook et al., 2007]
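This detect-the-expected / discover-the-unexpected idea can be illustrated with a minimal sketch: compare observed data against a model's predictions and flag the points whose deviation is surprisingly large, which a visual analytics view would then highlight for interactive inspection. The function name and threshold below are illustrative assumptions, not part of any cited work.

```python
import statistics

def flag_unexpected(values, predictions, z_threshold=3.0):
    """Return indices of observations that deviate unexpectedly from
    a model's predictions -- the kind of anomaly a visual analytics
    display would highlight for closer examination."""
    residuals = [v - p for v, p in zip(values, predictions)]
    mu = statistics.mean(residuals)
    sigma = statistics.stdev(residuals)
    # A point is "unexpected" if its residual lies more than
    # z_threshold standard deviations from the mean residual.
    return [i for i, r in enumerate(residuals)
            if sigma > 0 and abs(r - mu) / sigma > z_threshold]

# Example: a model predicts a steady value of 10; one observation
# of 50 is the surprising anomaly worth investigating.
observed = [10.0] * 19 + [50.0]
predicted = [10.0] * 20
print(flag_unexpected(observed, predicted))  # → [19]
```

In a real visual analytics tool, the flagged indices would drive visual emphasis (color, brushing, linked views) rather than a printed list; the point here is only the division of labor between automated screening and human assessment.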

Related Links

Visual Analytics @ YouTube

Basic Literature


