Ch 1/2/3: Intro, Data, Tasks Paper: Design Study Methodology

Tamara Munzner, Department of Computer Science, University of British Columbia

CPSC 547, Week 2: 17 September 2019
http://www.cs.ubc.ca/~tmm/courses/547-19

News
• Signup sheet round 2: check column (or add yourself)
• Canvas comments/question discussion
 – one question/comment per reading required
  • everybody got this right, great!
 – responses to others required
  • a few of you did not do this
  • original requirement of 2, considering cutback to just 1: discuss
  • decision: cut back to just 1

– if you spot a typo in the book, let me know if it's not already in the errata list
 • http://www.cs.ubc.ca/~tmm/vadbook/errata.html
 • (but don't count it as a question)
 • not useful to tell me about typos in published papers

Ch 1. What's Vis, and Why Do It?

Why have a human in the loop?

Computer-based visualization systems provide visual representations of datasets designed to help people carry out tasks more effectively. Visualization is suitable when there is a need to augment human capabilities rather than replace people with computational decision-making methods.

• don't need vis when fully automatic solution exists and is trusted
• many analysis problems ill-specified
 – don't know exactly what questions to ask in advance
• possibilities
 – long-term use for end users (e.g. exploratory analysis of scientific data)
 – presentation of known results
 – stepping stone to better understanding of requirements before developing models
 – help developers of automatic solution refine/debug, determine parameters

– help end users of automatic solutions verify, build trust

Why use an external representation?

Computer-based visualization systems provide visual representations of datasets designed to help people carry out tasks more effectively. • external representation: replace cognition with perception

[Cerebral: Visualizing Multiple Experimental Conditions on a Graph with Biological Context. Barsky, Munzner, Gardy, and Kincaid. IEEE TVCG (Proc. InfoVis) 14(6):1253-1260, 2008.]

Anscombe's Quartet: Raw Data
(table of raw X and Y values for datasets 1–4 not reproduced)

Why represent all the data?

Computer-based visualization systems provide visual representations of datasets designed to help people carry out tasks more effectively.
• summaries lose information, details matter
 – confirm expected and find unexpected patterns
 – assess validity of statistical model

Anscombe's Quartet

Identical statistics (all four datasets):
 x mean: 9
 x variance: 10
 y mean: 7.5
 y variance: 3.75
 x/y correlation: 0.816

https://www.youtube.com/watch?v=DbJyPELmhJc
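The identical summary statistics can be checked directly; a minimal sketch in Python using only the standard library (dataset 1 of the quartet is hand-entered here; population variance is used, matching the values quoted above):

```python
import statistics

# Anscombe's quartet, dataset 1 (the other three share these statistics)
x = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]

def pearson(a, b):
    """Pearson correlation: covariance over the product of std devs."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    var_a = sum((u - ma) ** 2 for u in a)
    var_b = sum((v - mb) ** 2 for v in b)
    return cov / (var_a * var_b) ** 0.5

print(statistics.mean(x))                 # 9
print(statistics.pvariance(x))            # 10
print(round(statistics.mean(y), 2))       # 7.5
print(round(statistics.pvariance(y), 2))  # 3.75
print(round(pearson(x, y), 3))            # 0.816
```

The summaries are identical across all four datasets even though the plots differ wildly, which is exactly the point of the slide: summaries lose information.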

Same Stats, Different Graphs

Why focus on tasks and effectiveness?

Computer-based visualization systems provide visual representations of datasets designed to help people carry out tasks more effectively.
• tasks serve as constraint on design (as does data)
 – idioms do not serve all tasks equally!
 – challenge: recast tasks from domain-specific vocabulary to abstract forms
• most possibilities ineffective
 – validation is necessary, but tricky
 – increases chance of finding good solutions if you understand full space of possibilities
• what counts as effective?
 – novel: enable entirely new kinds of analysis
 – faster: speed up existing workflows

Why are there resource limitations?

Vis designers must take into account three very different kinds of resource limitations: those of computers, of humans, and of displays.
• computational limits
 – processing time
 – system memory
• human limits
 – human attention and memory
• display limits
 – pixels are a precious resource, the most constrained of the three
 – information density: ratio of space used to encode info vs. unused whitespace
  • tradeoff between clutter and wasting space; find sweet spot between dense and sparse

Analysis: What, why, and how
• what is shown?
 – data abstraction
• why is the user looking at it?
 – task abstraction
• how is it shown?
 – idiom: visual encoding and interaction

• abstract vocabulary avoids domain-specific terms
 – translation process iterative, tricky
• what-why-how analysis framework as scaffold to think systematically about design space

Why analyze?
• imposes structure on huge design space
 – scaffold to help you think systematically about choices
 – analyzing existing as stepping stone to designing new
 – most possibilities ineffective for particular task/data combination

[SpaceTree: Supporting Exploration in Large Node Link Tree, Design Evolution and Empirical Evaluation. Grosjean, Plaisant, and Bederson. Proc. InfoVis 2002, p 57–64.]
[TreeJuxtaposer: Scalable Tree Comparison Using Focus+Context With Guaranteed Visibility. ACM Trans. on Graphics (Proc. SIGGRAPH) 22:453–462, 2003.]

What? Why? How?

What: Tree
Why (actions): Present, Locate, Identify
Why (targets): Path between two nodes
How (SpaceTree): Encode, Navigate, Select, Filter, Aggregate
How (TreeJuxtaposer): Encode, Navigate, Select, Arrange

How? (the design-choice taxonomy)
• Encode
 – Arrange: Express, Separate, Order, Align, Use
 – Map from categorical and ordered attributes:
  • Color: Hue, Saturation, Luminance
  • Size, Angle, Curvature, ...
  • Shape
  • Motion: Direction, Rate, Frequency, ...
• Manipulate: Change, Select, Navigate
• Facet: Juxtapose, Partition, Superimpose
• Reduce: Filter, Aggregate, Embed

VAD Ch 2: Data Abstraction

What? [VAD Fig 2.1]
• Datasets
 – Data Types: Items, Attributes, Links, Positions, Grids
 – Data and Dataset Types:
  • Tables: items, attributes
  • Networks & Trees: items (nodes), links, attributes
  • Fields: grids, positions, attributes
  • Geometry: items, positions
  • Clusters, Sets, Lists: items
 – Dataset Types:
  • Tables: attributes (columns), items (rows), cell containing value
  • Networks: link, node (item), attributes
  • Fields (Continuous): grid of positions, cell, value in cell
  • Geometry (Spatial): position
  • Multidimensional Table: value in cell
  • Trees
 – Dataset Availability: Static, Dynamic
• Attributes
 – Attribute Types: Categorical; Ordered (Ordinal, Quantitative)
 – Ordering Direction: Sequential, Diverging, Cyclic

Ch 2. What: Data Abstraction

Three major datatypes

Dataset Types
• Tables: attributes (columns), items (rows), cell containing value
• Networks: nodes (items), links
• Fields (Continuous): grid of positions, value in cell
• Geometry (Spatial): position

• visualization vs computer graphics
 – geometry is design decision
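To make the three major datatypes concrete, here is one way they might look as plain data structures; a hedged sketch in Python (the field names and toy values are invented for illustration, not taken from the book):

```python
# Table: items (rows) x attributes (columns); each cell holds a value.
table = [
    {"name": "a", "weight": 3.2, "category": "x"},
    {"name": "b", "weight": 1.7, "category": "y"},
]

# Network: items are nodes, links connect pairs of nodes;
# both nodes and links can carry attributes.
network = {
    "nodes": {"a": {"label": "first"}, "b": {"label": "second"}},
    "links": [("a", "b", {"strength": 0.9})],
}

# Field: a grid of positions with a value sampled in each cell
# (continuous data, discretized onto the grid).
field = [
    [0.0, 0.1, 0.2],
    [0.1, 0.3, 0.5],
]

# Geometry: items with explicit spatial positions.
geometry = [
    {"shape": "point", "position": (0.0, 1.0)},
    {"shape": "point", "position": (2.5, 3.5)},
]

print(len(table), len(network["links"]), field[1][2], geometry[0]["position"])
```

The point of the slide survives the translation: whether geometry is given to you or is something you choose to derive is a design decision.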

Attribute types
• Attribute Types
 – Categorical
 – Ordered: Ordinal, Quantitative
• Ordering Direction: Sequential, Diverging, Cyclic
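One rough way to operationalize the attribute-type distinction when first inspecting a dataset; a minimal sketch (the heuristic is illustrative only: whether an attribute is ordinal vs. categorical is ultimately a semantic judgment, not something the raw values fully determine):

```python
def guess_attribute_type(values):
    """Crude heuristic: all-numeric values -> quantitative,
    anything else -> categorical. Ordinal attributes (e.g. shirt
    sizes "S" < "M" < "L") look categorical to this check and need
    domain knowledge to classify correctly."""
    numeric = all(
        isinstance(v, (int, float)) and not isinstance(v, bool)
        for v in values
    )
    return "quantitative" if numeric else "categorical"

print(guess_attribute_type([3.2, 1.7, 9.0]))      # quantitative
print(guess_attribute_type(["apple", "banana"]))  # categorical
print(guess_attribute_type(["S", "M", "L"]))      # categorical (really ordinal)
```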

Dataset and data types

Data and Dataset Types
• Tables: items, attributes
• Networks & Trees: items (nodes), links, attributes
• Fields: grids, positions, attributes
• Geometry: items, positions
• Clusters, Sets, Lists: items

Data Types: Items, Attributes, Links, Positions, Grids
Dataset Availability: Static, Dynamic

Further reading: Articles
• Mathematics and the Internet: A Source of Enormous Confusion and Great Potential. Walter Willinger, David Alderson, and John C. Doyle. Notices of the AMS 56(5):586-599, 2009.
• Rethinking Visualization: A High-Level Taxonomy. Melanie Tory and Torsten Möller. Proc. InfoVis 2004, p 151-158.
• The Eyes Have It: A Task by Data Type Taxonomy for Information Visualizations. Ben Shneiderman. Proc. 1996 IEEE Visual Languages.
• The Structure of the Information Visualization Design Space. Stuart Card and Jock Mackinlay. Proc. InfoVis 97.
• Polaris: A System for Query, Analysis and Visualization of Multi-dimensional Relational Databases. Chris Stolte, Diane Tang, and Pat Hanrahan. IEEE TVCG 8(1):52-65, 2002.

Further reading: Books
• Visualization Analysis and Design. Munzner. CRC Press, 2014. – Chap 2: Data Abstraction
• Information Visualization: Using Vision to Think. Stuart Card, Jock Mackinlay, and Ben Shneiderman. Morgan Kaufmann, 1999. – Chap 1
• Data Visualization: Principles and Practice, 2nd ed. Alexandru Telea. CRC Press, 2014.
• Interactive Data Visualization: Foundations, Techniques, and Applications, 2nd ed. Matthew O. Ward, Georges Grinstein, Daniel Keim. CRC Press, 2015.
• The Visualization Handbook. Charles Hansen and Chris Johnson, eds. Academic Press, 2004.
• Visualization Toolkit: An Object-Oriented Approach to 3D Graphics, 4th ed. Will Schroeder, Ken Martin, and Bill Lorensen. Kitware, 2006.
• Visualization of Time-Oriented Data. Wolfgang Aigner, Silvia Miksch, Heidrun Schumann, Chris Tominski. Springer, 2011.

VAD Ch 3: Task Abstraction
Why? Actions, Targets

Actions
• Analyze
 – Consume: Discover, Present, Enjoy
 – Produce: Annotate, Record, Derive (tag)
• Search (by what the user knows)
 – target known, location known: Lookup
 – target known, location unknown: Locate
 – target unknown, location known: Browse
 – target unknown, location unknown: Explore
• Query: Identify, Compare, Summarize

Targets
• All Data: Trends, Outliers, Features
• Attributes: One (Distribution, Extremes); Many (Dependency, Correlation, Similarity)
• Network Data: Topology, Paths
• Spatial Data: Shape

• {action, target} pairs
 – discover distribution
 – compare trends
 – locate outliers
 – browse topology

[VAD Fig 3.1]

High-level actions: Analyze
• consume
 – discover vs present
  • classic split
  • aka explore vs explain
 – enjoy
  • newcomer
  • aka casual, social

• produce
 – annotate, record
 – derive
  • crucial design choice


Derive
• don't just draw what you're given!
 – decide what the right thing to show is
 – create it with a series of transformations from the original dataset
 – draw that
• one of the four major strategies for handling complexity
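The derive step can be sketched as a transformation over the original table; a minimal sketch with hypothetical column names and toy values, following the trade-balance example on this slide:

```python
# Original data: one record per year with exports and imports.
original = [
    {"year": 2017, "exports": 120.0, "imports": 100.0},
    {"year": 2018, "exports": 90.0, "imports": 110.0},
]

# Derived data: a new quantitative attribute computed per item,
# which is then what gets drawn (e.g. as a diverging bar chart).
derived = [
    {**row, "trade_balance": row["exports"] - row["imports"]}
    for row in original
]

print([row["trade_balance"] for row in derived])  # [20.0, -20.0]
```

Note that the derived attribute changes the abstraction: two sequential quantitative attributes become one diverging quantitative attribute.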

trade balance = exports − imports
(Original Data: exports, imports; Derived Data: trade balance)

Actions

Analyze

Consume Discover Present Enjoy

Produce Annotate Record Derive

Actions: Mid-level search, low-level query

• what does user know?
 – target, location
• how much of the data matters?
 – one, some, all
• independent choices, mix & match
 – analyze, query, search

Search
 – target known, location known: Lookup
 – target known, location unknown: Locate
 – target unknown, location known: Browse
 – target unknown, location unknown: Explore
Query: Identify, Compare, Summarize
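The 2×2 search classification above maps directly onto code; a minimal sketch:

```python
def search_kind(target_known, location_known):
    """Mid-level search action from VAD Fig 3.1: what the user
    knows about the target and its location picks the action."""
    return {
        (True, True): "lookup",
        (True, False): "locate",
        (False, True): "browse",
        (False, False): "explore",
    }[(target_known, location_known)]

print(search_kind(True, True))    # lookup
print(search_kind(False, False))  # explore
```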

Why? [VAD Fig 3.1: Actions and Targets]

Analysis example: Compare idioms (SpaceTree vs. TreeJuxtaposer)

[SpaceTree: Supporting Exploration in Large Node Link Tree, Design Evolution and Empirical Evaluation. Grosjean, Plaisant, and Bederson. Proc. InfoVis 2002, p 57–64.]

[TreeJuxtaposer: Scalable Tree Comparison Using Focus+Context With Guaranteed Visibility. ACM Trans. on Graphics (Proc. SIGGRAPH) 22:453–462, 2003.]

What? Why? How?

What: Tree
Why (actions): Present, Locate, Identify
Why (targets): Path between two nodes
How (SpaceTree): Encode, Navigate, Select, Filter, Aggregate
How (TreeJuxtaposer): Encode, Navigate, Select, Arrange

Analysis example: Derive one attribute
• Strahler number
 – centrality metric for trees/networks
 – derived quantitative attribute
 – draw top 5K of 500K for good skeleton

[Using Strahler numbers for real time visual exploration of huge graphs. Auber. Proc. Intl. Conf. Computer Vision and Graphics, pp. 56–69, 2002.]

Task 1: What? In: Tree; Out: Quantitative attribute on nodes. Why? Derive.
Task 2: What? In: Tree + Quantitative attribute on nodes; Out: Filtered Tree. Why? Summarize (Topology). How? Reduce (Filter); removed unimportant parts.

Chained sequences
• output of one is input to next
 – express dependencies
 – separate means from ends
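The Strahler-number derivation in the example above can be computed recursively; a minimal sketch for trees (the dict-of-children representation and the toy tree are invented for illustration):

```python
def strahler(children, node):
    """Strahler number: leaves get 1; an internal node gets the max
    child value, plus 1 if that max is attained by two or more children."""
    kids = children.get(node, [])
    if not kids:
        return 1
    s = sorted((strahler(children, k) for k in kids), reverse=True)
    return s[0] + 1 if len(s) > 1 and s[1] == s[0] else s[0]

# Toy tree: root r with two balanced subtrees -> Strahler 3 at the root.
tree = {"r": ["a", "b"], "a": ["c", "d"], "b": ["e", "f"]}
print(strahler(tree, "c"))  # 1 (leaf)
print(strahler(tree, "a"))  # 2 (two children tied at 1)
print(strahler(tree, "r"))  # 3
```

Filtering to the top-k nodes by this derived attribute is what yields the skeleton drawing described on the slide.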

Design Study Methodology: Reflections from the Trenches and from the Stacks

joint work with: Michael Sedlmair and Miriah Meyer
http://www.cs.ubc.ca/labs/imager/tr/2012/dsm/

Design Study Methodology: Reflections from the Trenches and from the Stacks. Sedlmair, Meyer, Munzner. IEEE Trans. Visualization and Computer Graphics 18(12):2431-2440, 2012 (Proc. InfoVis 2012).

Design Studies: Lessons learned after 21 of them

Cerebral (genomics), MizBee (genomics), Pathline (genomics), MulteeSum (genomics), Vismon (fisheries management), QuestVis (sustainability), WiKeVis (in-car networks)

MostVis, Car-X-Ray, ProgSpy2010, RelEx, Cardiogram, AutobahnVis, VisTra (all in-car networks)

Constellation (linguistics), LibVis (cultural heritage), Caidants (multicast), SessionViewer (web log analysis), LiveRAC (server hosting), PowerSetViewer (data mining), LastHistory (music listening)

• commonality of representations cross-cuts domains!

Methodology recipes

(figure: ingredients are to Methods as recipes are to Methodology)

Methodology for problem-driven work

• definitions
 [figure: task clarity axis (crisp to fuzzy) and information location axis (head to computer); design study methodology is suitable between 'algorithm automation possible' and 'not enough data']
• 9-stage framework

learn winnow cast discover design implement deploy reflect write

PRECONDITION (learn, winnow, cast): personal validation
CORE (discover, design, implement, deploy): inward-facing validation
ANALYSIS (reflect, write): outward-facing validation

• 32 pitfalls & how to avoid them
• comparison to related methodologies

[Table 1 of the paper: summary of the 32 design study pitfalls]
PF-1 premature advance: jumping forward over stages (general)
PF-2 premature start: insufficient knowledge of vis literature (learn)
PF-3 premature commitment: collaboration with wrong people (winnow)
PF-4 no real data available (yet) (winnow)
PF-5 insufficient time available from potential collaborators (winnow)
PF-6 no need for visualization: problem can be automated (winnow)
PF-7 researcher expertise does not match domain problem (winnow)
PF-8 no need for research: engineering vs. research project (winnow)
PF-9 no need for change: existing tools are good enough (winnow)
PF-10 no real/important/recurring task (winnow)
PF-11 no rapport with collaborators (winnow)
PF-12 not identifying front line analyst and gatekeeper before start (cast)
PF-13 assuming every project will have the same role distribution (cast)
PF-14 mistaking fellow tool builders for real end users (cast)
PF-15 ignoring practices that currently work well (discover)
PF-16 expecting just talking or fly on wall to work (discover)
PF-17 experts focusing on visualization design vs. domain problem (discover)
PF-18 learning their problems/language: too little / too much (discover)
PF-19 abstraction: too little (design)
PF-20 premature design commitment: consideration space too small (design)
PF-21 mistaking technique-driven for problem-driven work (design)
PF-22 nonrapid prototyping (implement)
PF-23 usability: too little / too much (implement)
PF-24 premature end: insufficient deploy time built into schedule (deploy)
PF-25 usage study not case study: non-real task/data/user (deploy)
PF-26 liking necessary but not sufficient for validation (deploy)
PF-27 failing to improve guidelines: confirm, refine, reject, propose (reflect)
PF-28 insufficient writing time built into schedule (write)
PF-29 no technique contribution = good design study (write)
PF-30 too much domain background in paper (write)
PF-31 story told chronologically vs. focus on final results (write)
PF-32 premature end: win race vs. practice music for debut (write)

Design studies: problem-driven vis research
• a specific real-world problem
 – real users and real data
 – collaboration is (often) fundamental
• design a visualization system
 – implications: requirements, multiple ideas
• validate the design
 – at appropriate levels
• reflect about lessons learned
 – transferable research: improve design guidelines for vis in general
  • confirm, refine, reject, propose

Design study methodology: definitions

[figure: task clarity (crisp to fuzzy) vs. information location (head to computer); design study methodology is suitable in the middle, between 'algorithm automation possible' and 'not enough data']

9-stage framework

learn winnow cast discover design implement deploy reflect write

PRECONDITION CORE ANALYSIS

9-stage framework: learn, winnow, cast (PRECONDITION)

9-stage framework: discover, design, implement, deploy (CORE)

9-stage framework: reflect, write (ANALYSIS)
• guidelines: confirm, refine, reject, propose

9-stage framework: iterative

Design study methodology: 32 pitfalls

• and how to avoid them
[Table 1 of the paper: summary of the 32 design study pitfalls]

Collaboration incentives: Bidirectional
• what's in it for domain scientist?
 – win: access to more suitable tools, can do better/faster/cheaper science
 – time spent could pay off with earlier access and/or more customized tools
• what's in it for vis?
– win: access to better understanding of your driving problems
• crucial element in building effective tools to help
– opportunities to observe how you use them
• if they're good enough, vis win: research success stories
– leads us to develop guidelines on how to build better tools in general
• vis win: research progress in visualization
• [The Computer Scientist as Toolsmith II, Fred Brooks, CACM 39(3):61-68, 1996]

39 PITFALL: PREMATURE COLLABORATION COMMITMENT
"I'm a domain expert! Wanna collaborate?" "Of course!!!"

40 METAPHOR Winnowing

41 Collaborator winnowing

initial conversation (potential collaborators)

42 Collaborator winnowing

initial conversation

further meetings

43 Collaborator winnowing

initial conversation

further meetings

prototyping

44 Collaborator winnowing

initial conversation

further meetings

prototyping

full collaboration → collaborator

45 Collaborator winnowing

initial conversation

further meetings

prototyping

full collaboration

Talk with many, stay with few!
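The staged filtering above can be sketched as successive filters over a pool of candidate collaborations. This is a hypothetical illustration only: the candidate records, field names, and stage predicates are invented for this sketch and are not part of the methodology paper.

```python
# Hypothetical sketch: collaborator winnowing as successive filters.
# Each stage keeps only the candidates that pass its check, so the pool
# shrinks from many initial conversations to few full collaborations.

def winnow(candidates, stages):
    """Apply each (name, predicate) stage in order, shrinking the pool."""
    pool = list(candidates)
    for name, keep in stages:
        pool = [c for c in pool if keep(c)]
    return pool

# Toy candidate records; the fields are invented for illustration.
candidates = [
    {"name": "lab A", "has_data": True,  "has_time": True,  "needs_vis": True},
    {"name": "lab B", "has_data": False, "has_time": True,  "needs_vis": True},
    {"name": "lab C", "has_data": True,  "has_time": False, "needs_vis": True},
    {"name": "lab D", "has_data": True,  "has_time": True,  "needs_vis": False},
]

stages = [
    ("initial conversation", lambda c: c["has_data"]),   # cf. PF-4: no real data
    ("further meetings",     lambda c: c["has_time"]),   # cf. PF-5: insufficient time
    ("prototyping",          lambda c: c["needs_vis"]),  # cf. PF-6: could be automated
]

survivors = winnow(candidates, stages)
print([c["name"] for c in survivors])  # -> ['lab A']: talk with many, stay with few
```

The point of the sketch is the funnel shape: each winnowing question is cheap to ask early and expensive to discover late, which is why the stages are ordered before full collaboration begins.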

46 Design study methodology: 32 pitfalls

• and how to avoid them

PF-1 premature advance: jumping forward over stages (general)
PF-2 premature start: insufficient knowledge of vis literature (learn)
PF-3 premature commitment: collaboration with wrong people (winnow)
PF-4 no real data available (yet) (winnow)
PF-5 insufficient time available from potential collaborators (winnow)
PF-6 no need for visualization: problem can be automated (winnow)
PF-7 researcher expertise does not match domain problem (winnow)
PF-8 no need for research: engineering vs. research project (winnow)
PF-9 no need for change: existing tools are good enough (winnow)
PF-10 no real/important/recurring task (winnow)
PF-11 no rapport with collaborators (winnow)
PF-12 not identifying front line analyst and gatekeeper before start (cast)
PF-13 assuming every project will have the same role distribution (cast)
PF-14 mistaking fellow tool builders for real end users (cast)

The desire of the visualization researcher to explain hard-won knowledge about the domain to the readers is understandable, but it is usually a better choice to put writing effort into presenting extremely clear abstractions of the task and data. Design study papers should include only the bare minimum of domain knowledge that is required to understand these abstractions. We have seen many examples of this pitfall as reviewers, and we continue to be reminded of it by reviewers of our own paper submissions. We fell headfirst into it ourselves in a very early design study, which would have been stronger if more space had been devoted to the rationale of geography as a proxy for network topology, and less to the intricacies of overlay network configuration and the travails of mapping IP addresses to geographic locations [53].

Another challenge is to construct an interesting and useful story from the set of events that constitute a design study.

47 Considerations for research problem: Have data? Have time? Have need? ...

48 Design study methodology: 32 pitfalls

49 Roles: bioinformatician, biologist, ... Are you a user??? ... or maybe a fellow tool builder?

50 Examples from the trenches
• premature collaboration
• fellow tool builders with inaccurate assumptions about user needs
• data unavailable early so didn't diagnose problems

PowerSet Viewer: 2 years / 4 researchers; WikeVis: 0.5 years / 2 researchers

51 Design study methodology: 32 pitfalls

52 PITFALL: PREMATURE DESIGN COMMITMENT
"I want a tool with that cool technique I saw the other day!"

53 PITFALL: PREMATURE DESIGN COMMITMENT
"Of course they need the cool technique I built last year!"

54 METAPHOR Design Space
[grid of design choices: + good, o okay, - poor; annotation: one technique...]

55 METAPHOR Design Space
[grid: small know scope]

56 Design study methodology: 32 pitfalls

METAPHOR Design Space
[grid of design choices: + good, o okay, - poor]

58 METAPHOR Design Space
[grid: broad know scope]

59 METAPHOR Design Space
[grid: know, consider]

60 METAPHOR Design Space
[grid: know, consider, propose]

61 METAPHOR Design Space
[grid: know, consider, propose, select]

62 METAPHOR Design Space
[grid: know, consider, propose, select; + good, o okay, - poor; annotation: Think broad!]
63 Design study methodology: 32 pitfalls
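The nesting in the design-space metaphor (know ⊇ consider ⊇ propose ⊇ select) can be sketched as sets. This is a hypothetical illustration: the specific design choices named below are invented for the sketch, not drawn from the slides or the paper.

```python
# Hypothetical sketch: the consideration space as nested subsets.
# Knowing and considering broadly means the one selected design is
# drawn from a large space rather than a prematurely narrowed one.

know     = {"scatterplot", "node-link", "matrix", "treemap", "parallel coords"}
consider = {"node-link", "matrix", "treemap"}   # subset judged plausible for the task
propose  = {"node-link", "matrix"}              # subset shown to collaborators
select   = {"matrix"}                           # the final design

# Premature design commitment (PF-20) collapses these sets too early;
# "Think broad!" means keeping each containing set as large as practical.
assert select <= propose <= consider <= know
```

A design-space grid rates each choice good/okay/poor; the winnowing of sets above is the same idea viewed as containment rather than as cell ratings.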

64 PITFALLS: PREMATURE DESIGN COMMITMENT; DOMAIN EXPERTS FOCUSED ON VIS DESIGN VS DOMAIN PROBLEM
"I want a tool with that cool technique I saw the other day!" "Tell me more about your current workflow problems!"

65 Design study methodology: 32 pitfalls

66 Pitfall Example: Premature Publishing

algorithm innovation: "Must be first!"
design studies: "Am I ready?"

• metaphor: horse race vs. music debut
[image sources: http://www.prlog.org/10480334-wolverhampton-horse-racing-live-streaming-wolverhampton-handicap-8-jan-2010.html and http://www.alaineknipes.com/interests/violin_concert.jpg]
67 Further reading: Design studies

• BallotMaps: Detecting Name Bias in Alphabetically Ordered Ballot Papers. Jo Wood, Donia Badawood, Jason Dykes, Aidan Slingsby. IEEE TVCG 17(12):2384-2391 (Proc. InfoVis 2011).
• MulteeSum: A Tool for Comparative Spatial and Temporal Gene Expression Data. Miriah Meyer, Tamara Munzner, Angela DePace and Hanspeter Pfister. IEEE Trans. Visualization and Computer Graphics 16(6):908-917 (Proc. InfoVis 2010), 2010.
• Pathline: A Tool for Comparative Functional Genomics. Miriah Meyer, Tamara Munzner, Mark Styczynski and Hanspeter Pfister. Computer Graphics Forum (Proc. EuroVis 2010), 29(3):1043-1052.
• SignalLens: Focus+Context Applied to Electronic Time Series. Robert Kincaid. IEEE Trans. Visualization and Computer Graphics (Proc. InfoVis 2010), 16(6):900-907, 2010.
• ABySS-Explorer: Visualizing genome sequence assemblies. Cydney B. Nielsen, Shaun D. Jackman, Inanc Birol, Steven J.M. Jones. IEEE Trans. Visualization and Computer Graphics (Proc. InfoVis 2009), 15(6):881-888, 2009.
• Interactive Coordinated Multiple-View Visualization of Biomechanical Motion Data. Daniel F. Keefe, Marcus Ewert, William Ribarsky, Remco Chang. IEEE Trans. Visualization and Computer Graphics (Proc. Vis 2009), 15(6):1383-1390, 2009.
• MizBee: A Multiscale Synteny Browser. Miriah Meyer, Tamara Munzner, and Hanspeter Pfister. IEEE Trans. Visualization and Computer Graphics (Proc. InfoVis 2009), 15(6):897-904, 2009.
• MassVis: Visual Analysis of Protein Complexes Using Mass Spectrometry. Robert Kincaid and Kurt Dejgaard. Proc. IEEE Symposium on Visual Analytics Science and Technology (VAST 2009), p 163-170, 2009.
• Cerebral: Visualizing Multiple Experimental Conditions on a Graph with Biological Context. Aaron Barsky, Tamara Munzner, Jennifer L. Gardy, and Robert Kincaid. IEEE Trans. Visualization and Computer Graphics (Proc. InfoVis 2008), 14(6):1253-1260, 2008.
• Visual Exploration and Analysis of Historic Hotel Visits. Chris Weaver, David Fyfe, Anthony Robinson, Deryck W. Holdsworth, Donna J. Peuquet and Alan M. MacEachren. Information Visualization (Special Issue on Visual Analytics), Feb 2007.
• Session Viewer: Visual Exploratory Analysis of Web Session Logs. Heidi Lam, Daniel Russell, Diane Tang, and Tamara Munzner. Proc. IEEE Symposium on Visual Analytics Science and Technology (VAST), p 147-154, 2007.
• Exploratory visualization of array-based comparative genomic hybridization. Robert Kincaid, Amir Ben-Dor, and Zohar Yakhini. Information Visualization 4:176-190, 2005.
• Coordinated Graph and Scatter-Plot Views for the Visual Exploration of Microarray Time-Series Data. Paul Craig and Jessie Kennedy. Proc. InfoVis 2003, p 173-180.
• Cluster and Calendar based Visualization of Time Series Data. Jarke J. van Wijk and Edward R. van Selow. Proc. InfoVis 1999, p 4-9.
• Constellation: A Visualization Tool For Linguistic Queries from MindNet. Tamara Munzner, Francois Guimbretiere, and George Robertson. Proc. InfoVis 1999, p 132-135.

68 Break

69 In-class exercise: Abstraction

70 Next Time
• to read
– VAD Ch. 4: Validation
– VAD Ch. 5: Marks and Channels
– VAD Ch. 6: Rules of Thumb
– paper: Artery Viz

• reminder: my office hours are right after class (Tue 5pm)
– super-quick stuff in classroom
– everything else in my office X661

71