Visualization Analysis & Design

Tamara Munzner Department of Computer Science University of British Columbia

NASA Goddard Science and Technology Colloquium, December 14 2016, Greenbelt MD
http://www.cs.ubc.ca/~tmm/talks.html#vad16nasa @tamaramunzner

Visualization (vis): defined & motivated

Computer-based visualization systems provide visual representations of datasets designed to help people carry out tasks more effectively. Visualization is suitable when there is a need to augment human capabilities rather than replace people with computational decision-making methods.

• human in the loop needs the details
 – doesn't know exactly what questions to ask in advance
 – long-term exploratory analysis
 – presentation of known results
 – stepping stone towards automation: refining, trust building
• external representation: perception vs cognition
• intended task, measurable definitions of effectiveness
more at: Visualization Analysis and Design, Chapter 1. Munzner. A K Peters Visualization Series, CRC Press, 2014.

Why analyze? (examples: SpaceTree, TreeJuxtaposer)
• imposes a structure on huge design space
 – scaffold to help you think systematically about choices
 – analyzing existing systems as a stepping stone to designing new ones

[SpaceTree: Supporting Exploration in Large Node Link Tree, Design Evolution and Empirical Evaluation. Grosjean, Plaisant, and Bederson. Proc. InfoVis 2002, p 57–64.]
[TreeJuxtaposer: Scalable Tree Comparison Using Focus+Context With Guaranteed Visibility. ACM Trans. on Graphics (Proc. SIGGRAPH) 22:453–462, 2003.]
What? Why? How?

Targets: Tree (SpaceTree); Path between two nodes (TreeJuxtaposer)
Actions: Present, Locate, Identify
How (SpaceTree): Encode, Navigate, Select, Filter, Aggregate
How (TreeJuxtaposer): Encode, Navigate, Select, Arrange

Analysis framework: Four levels, three questions
• domain situation
 – who are the target users?
• abstraction
 – translate from specifics of domain to vocabulary of vis
 – what is shown? data abstraction
  • often don't just draw what you're given: transform to new form
 – why is the user looking at it? task abstraction
• idiom
 – how is it shown?
 – visual encoding idiom: how to draw
 – interaction idiom: how to manipulate
• algorithm
 – efficient computation
[A Nested Model of Visualization Design and Validation. Munzner. IEEE TVCG 15(6):921-928, 2009 (Proc. InfoVis 2009).]
[A Multi-Level Typology of Abstract Visualization Tasks. Brehmer and Munzner. IEEE TVCG 19(12):2376-2385, 2013 (Proc. InfoVis 2013).]

Why is validation difficult?
• different ways to get it wrong at each level

Domain situation You misunderstood their needs

Data/task abstraction You’re showing them the wrong thing

Visual encoding/interaction idiom The way you show it doesn’t work

Algorithm Your code is too slow

Why is validation difficult?
• solution: use methods from different fields at each level

Domain situation (problem-driven work; anthropology/ethnography): observe target users using existing tools
Data/task abstraction
Visual encoding/interaction idiom (design): justify design with respect to alternatives
Algorithm (technique-driven work; computer science): measure system time/memory; analyze computational complexity
(cognitive psychology): analyze results qualitatively; measure human time with lab experiment (lab study)
(anthropology/ethnography): observe target users after deployment; measure adoption

[A Nested Model of Visualization Design and Validation. Munzner. IEEE TVCG 15(6):921-928, 2009 (Proc. InfoVis 2009).]

Design Study Methodology: Reflections from the Trenches and from the Stacks
Michael Sedlmair, Miriah Meyer, Tamara Munzner
http://www.cs.ubc.ca/labs/imager/tr/2012/dsm/ @tamaramunzner

[Design Study Methodology: Reflections from the Trenches and from the Stacks. Sedlmair, Meyer, Munzner. IEEE Trans. Visualization and Computer Graphics 18(12):2431-2440, 2012 (Proc. InfoVis 2012).]

Design Studies: Lessons learned after 21 of them

Cerebral (genomics), MizBee (genomics), Pathline (genomics), MulteeSum (genomics), Vismon (fisheries management), QuestVis (sustainability), WiKeVis (in-car networks)
MostVis, Car-X-Ray, ProgSpy2010, RelEx, Cardiogram, AutobahnVis, VisTra (all in-car networks)
Constellation (linguistics), LibVis (cultural heritage), Caidants (multicast), SessionViewer (web log analysis), LiveRAC (server hosting), PowerSetViewer (data mining), LastHistory (music listening)

Methodology for Problem-Driven Work
• definitions: when is a design study suitable?
 – task clarity axis: fuzzy to crisp
 – information location axis: head to computer
 – regions: NOT ENOUGH DATA / DESIGN STUDY METHODOLOGY SUITABLE / ALGORITHM AUTOMATION POSSIBLE
• 9-stage framework: learn, winnow, cast, discover, design, implement, deploy, reflect, write

PRECONDITION (personal validation): learn, winnow, cast
CORE (inward-facing validation): discover, design, implement, deploy
ANALYSIS (outward-facing validation): reflect, write

32 pitfalls and how to avoid them:
PF-1 premature advance: jumping forward over stages (general)
PF-2 premature start: insufficient knowledge of vis literature (learn)
PF-3 premature commitment: collaboration with wrong people (winnow)
PF-4 no real data available (yet) (winnow)
PF-5 insufficient time available from potential collaborators (winnow)
PF-6 no need for visualization: problem can be automated (winnow)
PF-7 researcher expertise does not match domain problem (winnow)
PF-8 no need for research: engineering vs. research project (winnow)
PF-9 no need for change: existing tools are good enough (winnow)
PF-10 no real/important/recurring task (winnow)
PF-11 no rapport with collaborators (winnow)
PF-12 not identifying front line analyst and gatekeeper before start (cast)
PF-13 assuming every project will have the same role distribution (cast)
PF-14 mistaking fellow tool builders for real end users (cast)
PF-15 ignoring practices that currently work well (discover)
PF-16 expecting just talking or fly on wall to work (discover)
PF-17 experts focusing on visualization design vs. domain problem (discover)
PF-18 learning their problems/language: too little / too much (discover)
PF-19 abstraction: too little (design)
PF-20 premature design commitment: consideration space too small (design)
PF-21 mistaking technique-driven for problem-driven work (design)
PF-22 nonrapid prototyping (implement)
PF-23 usability: too little / too much (implement)
PF-24 premature end: insufficient deploy time built into schedule (deploy)
PF-25 usage study not case study: non-real task/data/user (deploy)
PF-26 liking necessary but not sufficient for validation (deploy)
PF-27 failing to improve guidelines: confirm, refine, reject, propose (reflect)
PF-28 insufficient writing time built into schedule (write)
PF-29 no technique contribution = good design study (write)
PF-30 too much domain background in paper (write)
PF-31 story told chronologically vs. focus on final results (write)
PF-32 premature end: win race vs. practice music for debut (write)

(The slide reproduces Table 1 of the paper, "Summary of the 32 design study pitfalls that we identified", together with surrounding paper text on writing design study papers, on comparing methodologies such as ethnography, grounded theory, and action research, and on transferability rather than reproducibility as the goal.)

What? Datasets and Attributes

• Data types: items, attributes, links, positions, grids
• Dataset types: tables; networks & trees; fields (continuous); geometry (spatial); clusters, sets, lists
 – tables: items (rows), attributes (columns), value in cell; multidimensional tables
 – networks: items (nodes), links; trees
 – fields: grid of positions, cell containing value
 – geometry: position
• Attribute types: categorical; ordered (ordinal, quantitative)
 – ordering direction: sequential, diverging, cyclic
• Dataset availability: static, dynamic

Types: Datasets and data

Dataset types
• tables: items (rows), attributes (columns), cell containing value
• networks: nodes (items), links; spatial networks
• fields (continuous): grid of positions, cell containing value
• geometry (spatial): position
Attribute types
• categorical
• ordered: ordinal, quantitative

Why? Actions and Targets

Actions
• Analyze
 – Consume: Discover, Present, Enjoy
 – Produce: Annotate, Record, Derive (tag)
• Search (by whether target and location are known)
 – target known, location known: Lookup
 – target known, location unknown: Locate
 – target unknown, location known: Browse
 – target unknown, location unknown: Explore
• Query: Identify, Compare, Summarize

Targets
• All Data: Trends, Outliers, Features
• Attributes: One (Distribution, Extremes); Many (Dependency, Correlation, Similarity)
• Network Data: Topology, Paths
• Spatial Data: Shape

• {action, target} pairs, e.g.
 – discover distribution
 – compare trends
 – locate outliers
 – browse topology

Actions: Analyze, Query
• analyze
 – consume: discover vs present (aka explore vs explain); enjoy (aka casual, social)
 – produce: annotate, record, derive (tag)
• query: how much data matters? one, some, all
• independent choices

Derive: Crucial Design Choice
• don't just draw what you're given!
 – decide what the right thing to show is
 – create it with a series of transformations from the original dataset
 – draw that
• one of the four major strategies for handling complexity

Original data: exports, imports
Derived data: trade balance = exports − imports

Analysis example: Derive one attribute
• Strahler number
 – centrality metric for trees/networks
 – derived quantitative attribute
 – draw top 5K of 500K nodes for a good skeleton
[Using Strahler numbers for real time visual exploration of huge graphs. Auber. Proc. Intl. Conf. Computer Vision and Graphics, pp. 56–69, 2002.]
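The derive strategy above can be sketched in code; the country names and numbers below are invented for illustration, not from the talk.

```python
# Sketch of the "derive" design choice: transform the original dataset
# into the thing worth showing before drawing it. The countries and
# values here are made up for illustration.

original = [
    {"country": "A", "exports": 120.0, "imports": 90.0},
    {"country": "B", "exports": 45.0, "imports": 60.0},
]

def derive_trade_balance(rows):
    """Extend each row with the derived attribute trade_balance."""
    return [dict(row, trade_balance=row["exports"] - row["imports"])
            for row in rows]

derived = derive_trade_balance(original)
print([row["trade_balance"] for row in derived])  # [30.0, -15.0]
```

The visualization then encodes the derived attribute (trade balance) directly, rather than forcing the viewer to compare exports and imports mentally.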

Task 1 – What? In: Tree. Why? Derive. Out: Quantitative attribute on nodes.
Task 2 – What? In: Tree + quantitative attribute on nodes. Why? Summarize Topology. How? Reduce, Filter. Out: Filtered tree (unimportant parts removed).

Targets

All Data: Trends, Outliers, Features
Attributes: One (Distribution, Extremes); Many (Dependency, Correlation, Similarity)
Network Data: Topology, Paths
Spatial Data: Shape

How?
• Encode
 – Arrange: Express, Separate, Order, Align, Use
 – Map (from categorical and ordered attributes)
  • Color: Hue, Saturation, Luminance
  • Size, Angle, Curvature, ...
  • Shape
  • Motion: Direction, Rate, Frequency, ...
• Manipulate: Change, Select, Navigate
• Facet: Juxtapose, Partition, Superimpose
• Reduce: Filter, Aggregate, Embed

How to encode: Arrange space, map channels
• Arrange: Express, Separate, Order, Align, Use
• Map (from categorical and ordered attributes)
 – Color: Hue, Saturation, Luminance
 – Size, Angle, Curvature, ...
 – Shape
 – Motion: Direction, Rate, Frequency, ...

Definitions: Marks and channels
• marks: geometric primitives
 – points, lines, areas
• channels: control appearance of marks
 – position: horizontal, vertical, both
 – color
 – shape, tilt
 – size: length (1D), area (2D), volume (3D)

Encoding visually with marks and channels
• analyze idiom structure as combination of marks and channels
 – 1 channel: vertical position; mark: line
 – 2 channels: vertical position + horizontal position; mark: point
 – 3 channels: vertical position + horizontal position + color hue; mark: point
 – 4 channels: vertical position + horizontal position + color hue + size (area); mark: point

Channels: Expressiveness types and effectiveness rankings

Magnitude channels (ordered attributes), ranked most to least effective:
 1. position on common scale
 2. position on unaligned scale
 3. length (1D size)
 4. tilt/angle
 5. area (2D size)
 6. depth (3D position)
 7. color luminance
 8. color saturation
 9. curvature
 10. volume (3D size)

Identity channels (categorical attributes), ranked most to least effective:
 1. spatial region
 2. color hue
 3. motion
 4. shape

Channels: Matching Types

• expressiveness principle
 – match channel and data characteristics

Channels: Rankings

• expressiveness principle
 – match channel and data characteristics
• effectiveness principle
 – encode most important attributes with highest ranked channels

How?

Encode (Arrange, Map) · Manipulate (Change, Select, Navigate) · Facet (Juxtapose, Partition, Superimpose) · Reduce (Filter, Aggregate, Embed)

How to handle complexity: 3 more strategies + 1 previous
• derive new data to show within view (previous strategy)
• change view over time
• facet across multiple views
• reduce items/attributes within single view

How to handle complexity: 3 more strategies + 1 previous

• change view over time: most obvious & flexible of the 4 strategies

How to handle complexity: 3 more strategies + 1 previous
• facet data across multiple views

Idiom: Linked highlighting (System: EDV)
• see how regions contiguous in one view are distributed within another
 – powerful and pervasive interaction idiom
• encoding: different
• data: all shared
[Visual Exploration of Large Structured Datasets. Wills. Proc. New Techniques and Trends in Statistics (NTTS), pp. 237–246. IOS Press, 1995.]
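The mechanics behind linked highlighting can be sketched as a shared selection that every view observes; the class and view names below are hypothetical, for illustration only.

```python
# Sketch: linked highlighting as a tiny shared-selection model. Views
# encode the same items differently but observe one selection, so
# brushing in any view updates all of them.

class SharedSelection:
    def __init__(self):
        self.selected = set()
        self.views = []

    def register(self, view):
        self.views.append(view)

    def highlight(self, item_ids):
        self.selected = set(item_ids)
        for view in self.views:          # notify every linked view
            view.redraw(self.selected)

class View:
    def __init__(self, name):
        self.name = name
        self.last_drawn = None           # stand-in for actual rendering

    def redraw(self, selected):
        self.last_drawn = sorted(selected)

selection = SharedSelection()
scatter, histogram = View("scatterplot"), View("histogram")
selection.register(scatter)
selection.register(histogram)

selection.highlight([3, 7])      # brush items in either view...
print(histogram.last_drawn)      # [3, 7]  ...and all views update
```

Real systems add details (transient vs persistent selection, color assignment), but the observer structure is the core of the idiom.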

Idiom: bird's-eye (System: Google Maps)
• encoding: same
• data: subset shared
• navigation: shared
 – bidirectional linking
• differences
 – viewpoint
 – (size)
• overview-detail
[A Review of Overview+Detail, Zooming, and Focus+Context Interfaces. Cockburn, Karlson, and Bederson. ACM Computing Surveys 41:1 (2008), 1–31.]

Idiom: Small multiples (System: Cerebral)
• encoding: same
• data: none shared
 – different attributes for node colors
 – (same network layout)
• navigation: shared
[Cerebral: Visualizing Multiple Experimental Conditions on a Graph with Biological Context. Barsky, Munzner, Gardy, and Kincaid. IEEE Trans. Visualization and Computer Graphics (Proc. InfoVis 2008) 14:6 (2008), 1253–1260.]

Coordinate views: Design choice interaction

• data shared between views: all / subset / none
 – encoding same, all shared: redundant
 – encoding same, subset shared: overview/detail
 – encoding same, none shared: small multiples
 – encoding multiform, all shared: multiform
 – encoding multiform, subset shared: multiform overview/detail
 – no sharing: no linkage
• why juxtapose views?
 – benefits: eyes vs memory
  • lower cognitive load to move eyes between 2 views than remembering previous state with a single changing view
 – costs: display area; 2 views side by side each have only half the area of one view

Idiom: Animation (change over time)
• weaknesses
 – widespread changes
 – disparate frames
• strengths
 – choreographed storytelling
 – localized differences between contiguous frames
 – animated transitions between states

How to handle complexity: 3 more strategies + 1 previous
• reduce what is shown within single view

Reduce items and attributes
• reduce/increase: inverses
• filter: items or attributes
 – pro: straightforward and intuitive to understand and compute
 – con: out of sight, out of mind
• aggregate: items or attributes
 – pro: inform about whole set
 – con: difficult to avoid losing signal
• embed

• not mutually exclusive
 – combine filter, aggregate
 – combine reduce, facet, change, derive

Idiom: boxplot
• static item aggregation
• task: find distribution
• data: table
• derived data: 5 quantitative attributes
 – median: central line
 – lower and upper quartiles: boxes
 – lower and upper fences: whiskers
  • values beyond which items are outliers
 – outliers beyond fence cutoffs explicitly shown
[40 years of boxplots. Wickham and Stryjewski. 2012. had.co.nz]
(The slide reproduces text and Figure 4 from this paper: box plot, vase plot, violin plot, and bean plot of 100 draws from each of four distributions with mean 0 and standard deviation 1: standard normal, right-skewed, leptokurtic, and bimodal. Richer density displays reveal multimodality that is completely invisible in the boxplot; related displays discussed include the raindrop plot, sectioned density plot, horizon graph, and density strip.)
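The five derived quantitative attributes the boxplot aggregates items into can be computed directly; this is an illustrative sketch using the common 1.5 × IQR fence rule, with a hypothetical helper name and a simple linear-interpolation quantile.

```python
# Sketch: deriving the boxplot's five quantitative attributes (median,
# lower/upper quartiles, lower/upper fences) plus the explicit outliers
# beyond the fences.

def boxplot_stats(values):
    xs = sorted(values)
    n = len(xs)

    def quantile(q):
        # simple linear-interpolation quantile
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        return xs[lo] * (1 - frac) + xs[hi] * frac

    q1, med, q3 = quantile(0.25), quantile(0.5), quantile(0.75)
    iqr = q3 - q1
    lo_fence, hi_fence = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = [x for x in xs if x < lo_fence or x > hi_fence]
    return {"median": med, "q1": q1, "q3": q3,
            "fences": (lo_fence, hi_fence), "outliers": outliers}

print(boxplot_stats([1, 2, 3, 4, 5, 6, 7, 8, 100]))
```

The drawing step then encodes only these derived attributes and the outlier items, not the raw table.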

Idiom: Dimensionality reduction for documents
• attribute aggregation
 – derive low-dimensional target space from high-dimensional measured space
Task 1 – What? In: High-dimensional data. Why? Produce, Derive. Out: 2D data.
Task 2 – What? In: 2D data. Why? Discover, Explore; Identify. How? Encode, Navigate, Select. Out: Scatterplot, clusters & points.
Task 3 – What? In: Scatterplot, clusters & points. Why? Produce, Annotate. Out: Labels for clusters.
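Task 1's derivation step can be sketched with a tiny PCA via power iteration; this stands in for whatever DR method a real document pipeline would use (e.g. MDS or t-SNE), and the points are synthetic.

```python
import math

# Sketch: attribute aggregation by dimensionality reduction. A tiny
# power-iteration PCA projects 2D items onto their top principal
# component, collapsing the measured attributes into one derived one.

def project_top_component(points, iters=100):
    """Project 2D points onto their first principal component."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    centered = [(x - mx, y - my) for x, y in points]
    # 2x2 covariance matrix entries
    sxx = sum(x * x for x, _ in centered) / n
    syy = sum(y * y for _, y in centered) / n
    sxy = sum(x * y for x, y in centered) / n
    # power iteration for the dominant eigenvector
    vx, vy = 1.0, 1.0
    for _ in range(iters):
        vx, vy = sxx * vx + sxy * vy, sxy * vx + syy * vy
        norm = math.hypot(vx, vy)
        vx, vy = vx / norm, vy / norm
    return [x * vx + y * vy for x, y in centered]

# Items lying near the line y = x collapse onto one derived coordinate.
pts = [(0.0, 0.0), (1.0, 1.1), (2.0, 1.9), (3.0, 3.0)]
print(project_top_component(pts))
```

In the document use case the measured space has thousands of term dimensions rather than two, but the idea is the same: the scatterplot encodes derived coordinates, not raw attributes.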

What? / Why? / How? (framework recap on one slide: datasets and attributes; actions and targets; encode, manipulate, facet, reduce; at the domain, abstraction, idiom, and algorithm levels)

A quick taste of my own work!

technique-driven work, problem-driven work

theoretical foundations

evaluation

Technique-driven: Graph drawing
James Slack, Kristian Hildebrand

TreeJuxtaposer

David Auber Daniel Archambault (Bordeaux)

TopoLayout SPF Grouse GrouseFlocks TugGraph

Evaluation: Graph drawing
Joanna McGrenere, Dmitry Nekrasovski, Adam Bodnar (UBC)

Stretch and squish navigation
Joanna McGrenere, Jessica Dawson (UBC)

Search set model of path tracing

Technique-driven: Dimensionality reduction

Stephen Ingram

Glimmer DimStiller

Glint, QSNE

Evaluation: Dimensionality reduction
Melanie Tory

Points vs landscapes for dimensionally reduced data
Melanie Tory, Michael Sedlmair (UVic)
Guidance on DR & scatterplot choices

Taxonomy of cluster separation factors

Problem-driven: Genomics
Jenn Gardy (Microbio), Robert Kincaid (Agilent), Aaron Barsky

Cerebral

(screenshot: MizBee synteny browser; source: Human chr3, destination: Lizard chromosomes)

MizBee; MulteeSum, Pathline

Problem-driven: Genomics, fisheries

Cydney Nielsen Joel Ferstay (BC Cancer)

Variant View
Torsten Moeller, Maryam Booshehrian (SFU)

Vismon

Problem-driven: Many domains
Diane Tang, Heidi Lam (Google)

SessionViewer: web log analysis

Stephen North Peter McLachlan (AT&T Research)

LiveRAC: systems time-series

Evaluation: Focus+Context
Ron Rensink, Heidi Lam (UBC)

Distortion impact on search/memory

Robert Kincaid Heidi Lam (Agilent)

Separate vs integrated views

Problem-driven: Journalism
Jonathan Stray (Assoc Press), Matt Brehmer, Stephen Ingram

Overview

Johanna Fulda (Sud. Zeitung) Matt Brehmer

TimeLineCurator

Theoretical foundations
• Visual Encoding Pitfalls
 – Unjustified Visual Encoding
 – Hammer In Search Of Nail
 – 2D Good, 3D Better
 – Color Cacophony
 – Rainbows Just Like In The Sky
• Strategy Pitfalls
 – What I Did Over My Summer
 – Least Publishable Unit
 – Dense As Plutonium
 – Bad Slice and Dice
Nested Model · Papers Process & Pitfalls
Michael Sedlmair, Miriah Meyer

Design Study Methodology

Matt Brehmer

Abstract Tasks

1990-1995

The Shape of Space Outside In

Geomview Charlie Gunn Stuart Levy Mark Phillips Delle Maxwell

More Information @tamaramunzner
• this talk: http://www.cs.ubc.ca/~tmm/talks.html#vad16nasa

• book page (including tutorial lecture slides) http://www.cs.ubc.ca/~tmm/vadbook – 20% promo code for book+ebook combo: HVN17 – http://www.crcpress.com/product/isbn/9781466508910

Cover illustration: Eamonn Maguire

• papers, videos, software, talks, courses
http://www.cs.ubc.ca/group/infovis
http://www.cs.ubc.ca/~tmm
Visualization Analysis and Design. Munzner. A K Peters Visualization Series, CRC Press, 2014.