designing for TOUCH josh clark @bigmediumjosh

Learn more about the book, Designing for Touch: http://abookapart.com/products/designing-for-touch (A Book Apart, foreword by Brad Frost).

We're here to talk about the science of interface design. It used to be straightforward: mouse and cursor everywhere. Same inputs.

That's changing. Just as the shape and nature of our screens, our outputs, are changing, we're now wrangling a whole different set of inputs too.

Touch is certainly the most popular alternative input method. Not just on our mobile screens... hands and fingers are coming to life on our desktops, too.

As advertised, this workshop is about designing for touch.

We’ve been dealing with touch on smartphones, and tablets are coming on strong.

So sure, I’ll talk about guidelines for designing on phones and tablets. But perhaps more important, I’ll also talk about designing for touch on the desktop.

[next] Because friends, touch has erupted on the desktop, with touchscreen laptops, with hybrid tablet devices like Surface or iPad Pro w/keyboard. Not just hardware but software.

Windows and Chrome OS elevate touch as a first-class citizen on desktop operating systems.

And that has some thorny implications for responsive design.

ULTIMATELY, IT MEANS THIS: All desktop designs have to be touch-friendly, not just mobile.

But first...let’s begin at the beginning: With the basics, with first principles. I have an ugly truth for you, friends.

To date, we’ve thought about web design as a visual pursuit.

A feat of visual design, of information design. So we naturally approach our work in visual terms; we think in pixels.

But when you add touch to the mix... we go beyond the visual... and to the physical.

It's not just how your pixels LOOK but how they FEEL. How do your pixels feel in the hand?

designing for TOUCH
• Touchscreen layouts
• Gestures

Let's go back just a few years. To the release of a revolutionary new phone.

The thing worked in a wholly different way from what came before, but newcomers and technophiles alike were charmed by it.

Industry observers called it intuitive, efficient, even fun. The new phone quickly became a status symbol, owned by a select few at first.

As time went by, nearly everyone got one, and now we find its operation so natural that we can barely imagine phones working any other way.

You guys know the revolutionary new gadget I’m talking about. Bell Telephone’s remarkable Push-Button phone. In time we’d call it TouchTone. The year was 1963.

In retrospect, its layout seems so obvious. So familiar, no other design seems possible. But that wasn't the case. Bell researchers tested 16 different keypad variations, searching for the design that enabled the fastest, most reliable dialing.

Bell Lab scientists brought customers into the laboratory.

Telephone usability testing. They measured dialing speeds to fractions of a second. They asked callers about the comfort and feel of the keypads. They played with button tension and whether buttons should make a click sound when pressed.

So they were certainly interested in the keypad’s visual layout BUT EVEN MORE CRUCIALLY... they were concerned with its feel, its ease of use by the human hand.

The layout of the buttons had to be MORE than aesthetically pleasing; it had to be optimized for physical use. Fast forward to today, and we're learning the same lessons all over again. You're not "just" a visual designer.

That means you’re not just a visual designer anymore. You’re an industrial designer.

You’re also an industrial designer.

Or at least your job now includes aspects of industrial design, the art of crafting a physical thing.

Because when you design a touchscreen interface, you have real, honest ergonomics to consider. In a very real sense, you’re designing a physical device.

Not literally. Virtual, flickering liquid crystals

But because they're explored by human hands, unlike desktop experiences to date, you're designing how hands interact. We've been dealing with this on mobile for 5 years.

Phones and tablets confront us with a blank slate. Invite us to impose our own interface, ANY INTERFACE.

Because that interface requires touch, it defines the device in a very physical way.

Soldering circuit boards, molding plastic, die-casting. Real ergonomics: again, how does it feel in your hand?

[Chart: how people hold their phones: one hand 49%, cradled 36%, two hands 15%]

http://www.uxmatters.com/mt/archives/2013/02/how-do-users-really-hold-mobile-devices.php

Basically three ways to hold a phone. [next]

Steven Hoober did a field study, observed over 1300 people in the street tapping away at their phones.

Big plurality, nearly half, were tapping with one hand, with their thumb.

We switch between these grips often, very contextual. The most popular though is that one-handed grip.

Gives us freedom to use the other hand. To write, to hold coffee, to juggle a baby.

But I want to pause to look at that middle one, too, where we hold the phone in one hand and jab with the other. This picture shows a finger doing the tapping. Turns out that most of the time, when we use this hold-and-stab posture, it's not actually the finger that does the work. Instead, Steven observed: 72% of the time people use the cradle grip, we use our thumb.

Add it up, and that means 75% of our phone interactions are these two grips.

Three quarters of the time, WE TAP THE SCREEN WITH ONE THUMB.

We often talk about how we’re designing for fingers... All thumbs.

In reality, we’re designing for thumbs. As we’ll see, that truth cuts across other device types, too.

Thumbs are awesome. But they have their limits. And this is it: the thumb-tapping hot zone.

Fan-shaped comfort zone for thumb when held in right hand.

The bottom of the screen, on the side opposite the thumb (the left side for the right thumb), is where tapping is easiest, where the thumb naturally rests.

Obviously the thumb can reach anywhere onscreen, but this is the most comfortable zone. Important implications: that's why iOS puts primary controls at the bottom.

Turns desktop convention on its head

LEFT VS RIGHT Steven’s study found that two-thirds of the time, we tap the screen with the right hand.

But very fluid about which hand we use. More important bit is top vs bottom.

And look what’s NOT in the thumb zone: edit button. Twitter does something similar: navigation at bottom, and the New Tweet button at top. Because you don’t want to let tweets get away from you accidentally. You don’t HAVE to put data-changing buttons at top right.

If it’s the primary action that you want people to take over and over again, then put it down at the bottom.

That’s what Swarm does for checkin.

(Twitter’s post button actually isn’t the primary action; navigation/reading is primary.) Here are a few examples. Feet at bottom, scale at top.

Design to keep fingers out of the way of content. Hands, feet, fingers at bottom to make room for the display at top.

Consider this a cardinal rule: put controls below content. Labels and active states too. Move them out of the way of the obscuring finger. Familiar pattern for keyboards.

(And evidence of why you have to test your design on the device, not just draw it on desktop.)

We’ve been focusing on iOS examples. In iOS, controls sink to screen bottom. But it all depends on who gets there first.

So the story changes a little bit for Android. Complicated by the presence of system buttons on the bottom.

They’re doing the right thing: controls below content.

The trouble is, where do you put YOUR app controls?

The operating system beat you to the bottom of the screen. Do you stack them on top like you see here?

It's always bad to stack controls on touch devices (it invites errors and mistaps), but it's worst to do it at screen bottom.

Heavy thumb traffic and an obscured view (the hovering thumb) increase errors at the bottom. Instagram puts the primary action right up against the Home button. A minor mistap and you quit the whole application. Facebook for iOS (left) puts navigation options at bottom.

Android moves them up to top to get away from system buttons. But it also bumps the status into the canvas and it scrolls away.

That extra system toolbar means you lose some space for controls. Android claims valuable real estate.

SMALL PHONES

iOS apps: controls at screen bottom. Android apps: controls at screen top.

Started with very simple concept: Content at top, controls at bottom.

But it turns out, environment matters. Who gets there first.

And as a result, you get different rules for different platforms. But look! After years of saying toolbars must go at top, Google this year officially sanctioned bottom navigation. https://www.google.com/design/spec/components/bottom-navigation.html#bottom-navigation-behavior Google Plus did this a few months ago.

I'm not a fan, but I understand the need. Samsung's 7-inch Galaxy W "phone": the phablets are here! iPhone 6s and 6s Plus. Google's Nexus 6 is even bigger.

So let’s return to our key question: where do hands and fingers (and thumbs) come to rest on these devices? 74%

http://www.elearningguild.com/research/archives/index.cfm?id=174&action=viewonly

Hoober and Patti Shank found that 74 percent of phablet owners work their screens with two hands.

Compared with just 51% two-handed use on smaller phones.

http://www.elearningguild.com/research/archives/index.cfm?id=174&action=viewonly

The most-used grip (35 percent) has people holding their phablets in one hand while tapping with the index finger of the other.

So does that mean the thumb zone doesn't apply?

But look at all of these grips, where thumbs are doing the work: 59% thumbs on phablets.

[Diagram: one-handed thumb zone on a phablet, from easy to okay reach]

One-handed thumb zone for phablets.

Something weird happens as the phone gets larger. The thumb zone gets smaller, because of the way you have to anchor the thing with your pinky.

Choke up to get at higher areas on the phone.

No matter what your handhold, the top of a phablet screen is always going to be a difficult reach, particularly for one-handed use.

As with smaller phones, it's ideal to follow the "content above controls" rule to keep frequent tap targets within reach and out of the way of content.

[Mockup: Android screen with a title bar and tabs at top]

The change here is for Android; instead of putting all Android controls at screen top as you would for smaller phones, phablet interfaces can slide some of those frequent controls down to screen bottom by following the “split action bar” design pattern. Moves frequent controls down to a separate toolbar at screen bottom.

Originally created for small screens that couldn’t fit everything at top. Finding new utility in phablets where you can’t reach.

THIS IS A COMPROMISE. Stacking controls at bottom risks tap errors. But at least they can reach the controls. Have to choose the least evil. Another approach is to supplement top-of-screen buttons with bottom-of-screen interactions.

So here, for example, in Android’s People app, you can tap the tabs at the top of the screen—if you’re able to reach.

...but in the alternate gesture universe... You can also swipe.

So you have it both ways. Controls at top—appropriate to android—but interaction at bottom. A floating trigger button can be another useful workaround here.

These buttons tuck into the bottom corner of the screen, and stay fixed in place, hovering above the rest of the interface as it scrolls past.

Because this button doesn’t extend full width, the stacking penalty on Android is far smaller than it is for a full toolbar.

In Android’s UI lingo, this trigger button is called a floating action button; that’s what we’re looking at here.

Android’s Material Design guidelines for the button suggest that you space it far enough from the system buttons that the stacking risk is minimal.
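To make the pattern concrete, here's a minimal sketch of a floating trigger button in TypeScript. It's my own illustration, not code from any of the apps or sites mentioned here; the sizes, labels, and share-panel hookup are assumptions.

```ts
// Minimal sketch of a floating trigger button: pinned to a bottom corner,
// hovering above the content as it scrolls, toggling a panel of options.
// All names, sizes, and styles here are hypothetical.
function addFloatingTrigger(panel: HTMLElement): HTMLButtonElement {
  const button = document.createElement('button');
  button.textContent = 'Share';

  Object.assign(button.style, {
    position: 'fixed',   // stays put while the page scrolls underneath
    right: '16px',
    bottom: '16px',      // tucked into the corner, clear of system buttons
    width: '56px',
    height: '56px',      // comfortably above the ~44px / 7mm minimum
    borderRadius: '50%',
    zIndex: '100',
  } as Partial<CSSStyleDeclaration>);

  button.addEventListener('click', () => {
    // Tap once to reveal the toolbar of options, tap again to hide it.
    panel.hidden = !panel.hidden;
  });

  document.body.appendChild(button);
  return button;
}
```

You'd call it with whatever element holds the sharing tools, e.g. `addFloatingTrigger(document.getElementById('share-tools') as HTMLElement)`.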

When my team designed Entertainment Weekly's responsive mobile website, for example, we used a floating trigger button to offer quick access to sharing tools for all screen sizes, from tiny phone to large desktop. Tap the button once to reveal a toolbar of sharing options.

Samsung introduced a one-handed mode for its jumbo Galaxy Note phones, and now on its Galaxy S5. It actually shrinks the screen to the size of a regular phone.

This feels like giving up.

A few other features, though, take more of a nudge approach. Special settings let you shrink the keyboard or phone pad and then bring it within reach of your thumb. https://www.youtube.com/watch?v=mrxrZ8Y0-XE Apple has something similar called "Reachability" for iPhone 6 and 6 Plus. Double-touch the home button to slide the whole screen down within reach.

The idea: if your thumb can't reach the screen, bring the screen to the thumb. https://www.youtube.com/watch?v=r3CTKI6pRlQ

FUN WITH PHABLETS
Favor screen bottom
Bottom gestures supplement top controls
Move controls within reach

In addition to tabs at top, swipe through views.

Anchor link nav: The top-corner position staked out by a menu link will admittedly be out of reach for a single thumb, but it also turns out that navigation menus are only rarely used by web users.

Cross-phablet stretch: Avoid tap targets that hug the left or right edge too tightly. People love that middle of screen. In the main body of the page, favor targets that stretch at least a third into the screen, if not the full width.

Mountain to Muhammad: Most interfaces are fixed landscapes that we roam across. We move to the buttons; they never come to us. Samsung, for example, created a special "one-handed operation" mode for its jumbo Android phones; Apple has Reachability.

Instead of sliding the entire interface up or down, though, a more practical use in webpages is to slide menus in from the edges.

As mentioned earlier, though, side tabs are less than ideal for phablets, where side edges are both outside the comfortable thumb zone. Sides are still far easier to hit than the top, making this sliding panel solution better than top-of-screen controls. (Better for tablets.)

Instapaper. Here you see controls at the top of the screen. Closer to the desktop standards we're used to. More visually available at the top.

Turns out it’s better ergonomically, too.

In general, your grip will tend to be around middle of screen, so thumbs at top 1/3.

You want to avoid putting controls at center where your hand roams over the content, obscures it.

Instead, put those controls in reach of where thumbs come to rest. Don't make me haul my giant meat pointer. In portrait: the side works well for main navigation or other primary controls. For landscape, consider controls at the side, too. That's where you'll be holding it, and you have plenty of room. An ebook app. Some bottom-of-page stuff is fine, especially when you're working on something that will change the content in the main frame.

Not ideal for quick access to tools, or frequent taps.

But when you need a control to do browsing or action on a screen, bottom works well.

TABLET GUIDELINES
Favor sides and corners
Avoid top center
Use bottom only to see the effect in the canvas

"The Hybrids"

Now we’re onto something at once thrilling and awful. The greatest horror our screens have ever experienced.

Oh yes, friends, it is a terrifying combination indeed. Touchscreen AND keyboard. Sometimes touchscreen detachable to make it a tablet. Sometimes fold it back to make it keyboard free. Other times it’s always attached.

All combine keyboard and touch at some point.

Similar devices have been around for a while, but not in large numbers, so research and design practices are a bit rare. That said, some ergonomic behaviors are already clear.

1. Mouse/trackpad use drops way off.
2. You get people touching directly almost all the time.
3. Scrolling, selection, even form fields.

People accustomed to tabbing between form fields switch: they touch fields directly even if it's less efficient.

Turns out that touch is a powerful invitation. People use it.

But wow, that seems physically uncomfortable, right? Steve Jobs always said nobody will ever want to use one of these screens. Nobody wants to work like this, with gorilla arms.

And it turns out people actually DON'T do gorilla arms.

One or both hands gripping the bottom of the screen. If one hand, using the other to tap or scroll with the index finger.

Support for floppy cover. So this is the hot zone.

This is why in Windows you have these gestures to swipe toolbars in from the right side or from the bottom.

Extremely comfortable for thumbs in this position.

Confusing things a bit is the fact that using the index finger to work the heart of the canvas means this is the thumb zone. Corners are tough.

EXACT OPPOSITE OF THE THUMB ZONE.

So what to do? Navigation and controls at bottom and sides, or at top and middle? It turns out that after frequent use, people start to go all thumbs. They use thumbs for all taps, stretching in.

The thumb dance. Do that crazy dance, little guys.

So this thumb pattern tends to be ideal for frequent controls.

Windows 8: swipe from the right edge to bring up the charms bar. (Windows 10: the new Notification Center.) Swipe up from the bottom for application controls. Xbox Music puts navigation at the left edge and playback at bottom right. Perfect for thumbs. Facebook puts navigation at left and friends at right.

How does this compare with a tablet’s hot zone? If you overlay the two... You get a common area that looks like this.

So the most touch-friendly zones for layout, addressing both tablet and desktop, are the sides.

I don’t know about you... desktop designs I’ve built haven’t optimized for these areas.

We generally optimize for top or middle of screen, right? But top middle is actually most hostile to this thumb zone.

Getting the picture? For all of these form factors, it's not about making things finger-friendly but THUMB-FRIENDLY.

SEVEN millimeters

BUT HOW BIG, Josh? When we’re designing for touch, how big should touch targets be?

7mm, about ¼": the spread of a fingertip as its contact point on the screen. This is remarkably consistent no matter what the size of your fingers; from babies to the Incredible Hulk, the contact point remains consistent.

But also: the size of target that a finger can reliably hit.

[Chart (Microsoft): missed taps vs. target size, 3mm to 11mm: roughly 1 in 30 taps (3%) missed at 5mm targets, 1 in 100 (1%) at 7mm, 1 in 200 (0.5%) at 9mm]
http://go.microsoft.com/fwlink/p/?linkid=242592

Microsoft did some research for Windows—for both desktop and phones— and found very consistent results.

So 7mm pretty good for everyday, 9mm if you’re being super-cautious.

I don’t know about you. For me, mm isn’t exactly my standard css unit. So how do we specify this size?

Well, maybe standards can help us out here. At 160dpi, 7mm comes out to about 44.

44 works across all platforms. Web, iOS, and Android all have the same principle of a resolution-independent unit. They are all exactly the same size but have different names: 44 pixels on the web, 44 points in iOS, 44 dp in Android.

Android recommends bumping it up to 48dp, a little over 7.5mm. That compensates for uneven hardware. It makes sure the target will be at least 7mm.

So 44 should technically work in Android. Bump it a little higher to allow some margin of error.

Compromise necessary sometimes. Have to get all the letters of the keyboard on the screen. Squeeze to 30 width.

In iOS, the 44 number is not just a guideline; it's actually cooked into the operating system everywhere.

As long as one dimension is at least 44, you can squeeze the other to 30.

Practical minimum for tap targets: 44x30 or 30x44.
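Here's a rough way to make that floor checkable in the browser; a sketch of my own, not an official guideline or API. It walks common interactive elements and warns about anything under the 44x30 CSS-pixel minimum (the function and constant names are made up):

```ts
// Hypothetical audit helper: flag tap targets smaller than the practical
// 44x30 / 30x44 CSS-pixel minimum discussed above (44 ≈ 7mm at 160dpi).
const MIN_LONG_SIDE = 44;   // at least one dimension should hit 44
const MIN_SHORT_SIDE = 30;  // the other can squeeze down to 30

function isTouchFriendly(rect: DOMRect): boolean {
  const long = Math.max(rect.width, rect.height);
  const short = Math.min(rect.width, rect.height);
  return long >= MIN_LONG_SIDE && short >= MIN_SHORT_SIDE;
}

function auditTapTargets(root: ParentNode = document): void {
  const candidates = root.querySelectorAll<HTMLElement>(
    'a, button, input, select, textarea, [role="button"]'
  );
  candidates.forEach((el) => {
    const rect = el.getBoundingClientRect();
    if (rect.width === 0 || rect.height === 0) return; // skip hidden elements
    if (!isTouchFriendly(rect)) {
      console.warn(
        `Tap target below 44x30 CSS px: ${Math.round(rect.width)}x${Math.round(rect.height)}`,
        el
      );
    }
  });
}

auditTapTargets();
```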

There is a caveat to this, which is that location on the screen matters.

Man enough: I had one of these bad boys, the gold Data Bank calculator watch. Not only small buttons, but too close together. Aim for 5, get 8 or 2. More Wheel of Fortune than calculator.

Not just button size but spacing. The closer together the targets, the larger they need to be: 7mm touch targets with 2mm spacing.

7mm with 2mm spacing... or 9mm minimum when jammed together.

Thing is, we're losing density here. We can't fit as much content into the screen; we have to show less, which means we have to do harder thinking about what goes into each view.

[Diagram: target sizes by screen position, growing from 7mm near the center to 9mm and 11mm toward the edges and corners]
http://www.elearningguild.com/research/archives/index.cfm?id=174&action=viewonly

In particular, you see touch accuracy dip as you move away from the center. The corners especially.

So even though that bottom left corner is in comfort zone, it’s not always super-accurate. So you can consider boosting corner targets to 11-12mm for best accuracy.

Still, in most cases, that 44 target will do the trick.

iOS cooked 44 into the operating system everywhere, right from the very first version. Here we're looking at an app in iOS 2.

Nav bar, height of rows in standard list view, toolbar.

Home screen grid doubles up to 88.

The idea is that by organizing in multiples of 44, multiples of the finger's touch surface, the app not only looks right but literally feels right.

Design to a 44-pixel rhythm.

Gives you an easy-to-read, easy-to-touch interface.

How cool is this! All elements sized exactly in proportion to your fingerprint.

Not only for the hand, but of the hand.

Every element in proportion, not only to one another but to the finger itself.

Design to a 7mm rhythm. Don't think of 7mm as just for buttons but for the overall layout.

That's lovely. But THIS is not....

designing for GESTURE

What can you count on?

Mentioned before that we have tap and swipe on the web as the only reliable gestures.

Not tons better in apps, to be honest, but we do have a set of foundational, primitive gestures that we can rely on now.

Let’s just run through those. Tap, activate, show me more. This is the ask-the-question thing that drives everything.

Hover substitute. Swipe. Almost always next/previous. That can mean scrolling, or it can mean flipping cards. But there are somewhat creative ways to use it, too:

To cross something out, for deletion. To slide a hidden panel, reveal actions below, or to reveal a drawer. We’ll see some examples of those more advanced interactions in a bit. Double tap.

Zoom in and out.

Android adds its own wrinkle to it that you can double tap and on that second tap slide up and down to control how much you’re going to zoom. But the core gesture is double-tap. Pinch and spread, of course, the more playful zoom in/out gesture.

Can also be used to close or open a view, and we’ll see examples of that coming up. Tap and hold brings up contextual actions or a contextual menu. Treat this as the right click of touch screens. Not all users discover it. Often a shortcut to clicking through to the item to see what actions you can perform on it.

Long press and drag is more well known -- the gesture for drag and drop across all platforms.
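If you're wiring the basic swipe up yourself on the web, a bare-bones version with Pointer Events might look something like this. A sketch only: the 50px threshold, the vertical-move guard, and the callback names are all assumptions, not a standard recipe.

```ts
// Bare-bones next/previous swipe detection with Pointer Events.
// The threshold and callback names are placeholder assumptions.
function onSwipe(
  el: HTMLElement,
  callbacks: { next: () => void; previous: () => void },
  minDistance = 50 // horizontal travel (px) before we call it a swipe
): void {
  let startX = 0;
  let startY = 0;
  let tracking = false;

  el.addEventListener('pointerdown', (e) => {
    tracking = true;
    startX = e.clientX;
    startY = e.clientY;
  });

  el.addEventListener('pointerup', (e) => {
    if (!tracking) return;
    tracking = false;
    const dx = e.clientX - startX;
    const dy = e.clientY - startY;
    // Ignore short or mostly vertical moves so we don't fight scrolling.
    if (Math.abs(dx) < minDistance || Math.abs(dx) < Math.abs(dy)) return;
    if (dx < 0) {
      callbacks.next();     // finger moved left: advance
    } else {
      callbacks.previous(); // finger moved right: go back
    }
  });
}
```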

That covers standard, easily discoverable gestures. But what about the fancy stuff, the tricky, abstract gestures? These are point-and-click gestures.

These five gestures are effective; they work just fine. But they are primitive, limited.

They were designed to port existing mouse-and-cursor interactions to the touchscreen. They maintain the status quo. With the exception of pinch and spread, it’s a one-to-one relationship with click/drag/scroll desktop actions.

They are exactly as expressive as a mouse cursor, sidestepping the hand’s subtlety and reducing it to a single jabbing finger. Hands are far more expressive than this.

Don’t limit them to pushing buttons.

These basic gestures don’t take full advantage of the medium. They continue to bring us back to old, troubled desktop metaphors.

Sign language interpreter for rapper Kendrick Lamar. Audio in this clip is really bad, probably for the best. For those of you who read sign language, I apologize, I suspect this is a little raunchy. https://www.youtube.com/watch?v=k_cnlQmsScU

THAT is expressive. So much said in the hands. So why are we limiting ourselves to simple point and click interactions?

Part of the limitation is in the device itself. Ultimately it's just a sheet of glass. And the other piece is that we're just at the start of creating this vocabulary. We haven't yet applied our imagination to it. Let's bring some imagination to this work. Super excited about the possibilities of touch interfaces.

I believe touch forces, or should force, important, FUNDAMENTAL changes in how we approach the design of these interfaces.

Touch will help us sweep away decades of buttons, menus, folders, tabs, and administrative debris to work directly with the content. When you remove this thing, the cursor, and the mouse, too, these prosthetics we've pointed at stuff for 25 years, all that remains... is you and the device, or better... you and the content.

Or that’s the illusion we’re able to create. Changes the experience when it feels like you’re working directly with content and objects.

It cuts through complexity to interact directly instead of pumping buttons.

Buttons do add complexity. It’s a problem we always confront as designers. The more features you add, the more interface comes along with it. If we’re not careful, we start drowning in interface chrome.

It's even in cars. So, buttons, what's so bad about buttons? They've served us well, but touch gives us the opportunity to make up for their downsides. Let's take a look at the trouble with buttons.

Buttons add complexity

The more features you have, the more controls it seems you need. If you’re not careful, buttons start sprouting like mushrooms all over your interface. It creates a sense of visual overload. Because it’s primarily a visual interface. Consider console gaming. This is what a game joystick used to look like. Now we have 11 buttons, two triggers, two joysticks and a D-pad controller.

That was the state of the art when mobile gaming began in earnest. And so you saw game designers port this complex button system to touchscreen games. Earthworm Jim. Look at all those buttons. So we simplified. Angry Birds was a single swipe on the screen.

But it was an important first step to say, this is new, we need new interactions. Now we’re getting into new cases, really complex gameplay suited to the touchscreen.

In the game Republique, you’re an unseen hacker who helps guide a woman to escape from a mysterious facility.

You view the world through many surveillance cameras and tell her where to go and when to move.

Most actions involve tapping on cameras to take them over, and then tapping on locations to tell the escapee where to go.

The world itself is the control, no buttons required.

All genres of interface design should explore a similar shift in perspective. Touch begs for direct interaction with the content itself.

So buttons add complexity. What else?

Buttons are a HACK

They are NOT direct interaction with the content you want to work with.

In the real world AND in software, buttons are abstractions. They work at a distance on the primary object. Often necessary, often the best available solution. Light switches, for example, operate at a distance. They add a middleman, an extra layer of complication.

Touch: get rid of visual abstractions to work with content directly.

Not saying they’re evil or bad.

Recognize that buttons are workarounds.

As we get new interaction models like touch, speech, natural gestures, and facial recognition, it’s worth asking: do I still need that hack? Is there an opportunity to manipulate content more directly?

Or at least as backups. In 2013, touchscreen browsers began to do the right thing, when Google Chrome, Mobile Safari, and others added gestural browsing.

After decades of using a physical metaphor, the page, to describe the web, we finally got a physical interaction to navigate it. The app is designed to really encourage exploration. Interact directly with words, URLs, entire content contexts.

[flip through]

Tap the content, slide these views around. Paw thru your history. Each slide is a representation of the content. You’re sliding the content back and forth, like cards. No back button needed.

The key bit of designing for touch is to imagine: how would data objects look and behave if they were suddenly physical? Like slips of paper on my desk.

Match your INTERACTION METAPHOR to your INFORMATION METAPHOR

The interaction metaphor should match the information metaphor.

Still have buttons for the basic actions. Direct calls to action. That’s when you’re embracing that physical metaphor.

There's still a use for more abstract or hidden gestures.

Misfit Shine

Shine is a product from a company called Misfit Wearables. It's a fitness tracker like the Nike Fuelband or Fitbit, but it LOOKS better. Most wearables aren't very wearable. This one's different, an elegant metal disc. Wear it like a pendant or a watch.

And watch how you sync it. [Video] Just put it right on the screen and the phone seems to absorb the data. Like it’s the physical touch that does it.

In fact, it’s a piece of clever misdirection. It’s actually syncing via Bluetooth, has nothing to do with the screen.

Because of the aluminum housing, the Shine’s bluetooth range is very short. Have to get it very close to the phone to sync, so they came up with this illusion that it syncs by putting it on the screen. Can’t get much closer than that.

Detects the touch on the screen and triggers a wireless sync. http://bitly.com/shine-sync

Skeuomorphism is out of vogue, largely because of visual kitsch. This feels old, clumsy now. But! You naturally know how to use it. Just swipe to turn the page.

To turn the page, you have to use one of these tiny buttons at bottom.

You have to embrace your interface metaphor. If you’re going to make it look like a book, make it act like one. Contacts: Tapping doesn’t turn page Swipe deletes

Your interface metaphor suggests how to use the app. Here, the book metaphor is a confusing misdirection. It creates the expectation that the app works like a book, when it really works through desktop-style buttons. Don't do "looks like" IF YOU CAN'T DO "ACTS LIKE". "Acts like" is more important than "looks like". Let's stick with old-school for a moment.

Great proof of concept. Changes the way you approach the interface.

This is exactly the way we need to start moving, not to blow up all UI conventions outright, but to question them and ask:

In light of direct interaction, do I still need that button, or is there a better way to do it? If this data were a physical object, how would I stretch and change it?

The developer was inspired by music, the idea of building muscle memory.

Reinforce MUSCLE MEMORY

Radial menus are a good example. They're a simple hold-and-swipe maneuver.

At first blush, these things might seem more complicated than a plain old tool bar, full of visual information to process.

At their core, though, radial menus are essentially gesture-based: touch-swipe-release (not tap and then tap again, it’s all a single gesture).

That’s why some call radial menus “marking menus”: it’s like making a mark on the screen. Swiping to 2 o’clock has one meaning, and swiping to 6 o’clock another.

This means you get faster with radial menus over time, because they take advantage of muscle memory in a way that list-based menus cannot. Studies from 1988 and this one from 1994 found radial menus to be over three times faster to use than a regular menu.

They’re fast. http://globalmoxie.com/blog/radial-menus-for-touch-ui.shtml We’re starting to see them more and more in native apps -- and even cooked into the operating systems.

Android has a “quick control” feature that lets you swipe in from the edge to bring up a radial menu of frequent tools. They’re in the iOS Messages app.
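The mechanics are simple enough to prototype yourself: the heart of a marking menu is turning the press-swipe-release direction into a command. A hypothetical sketch (the command list is invented, and a real menu would also draw the options on press):

```ts
// Sketch of the marking-menu idea: press, swipe, release, and map the
// release angle to one of N commands. The command list is invented.
type Command = 'copy' | 'paste' | 'delete' | 'share';
const commands: Command[] = ['copy', 'paste', 'delete', 'share'];

function commandForRelease(
  startX: number, startY: number,
  endX: number, endY: number
): Command | null {
  const dx = endX - startX;
  const dy = endY - startY;
  if (Math.hypot(dx, dy) < 30) return null; // too short: treat as a plain tap

  // atan2 gives the swipe angle; carve the circle into equal sectors,
  // one per command, so each direction always means the same thing.
  const angle = Math.atan2(dy, dx) + Math.PI;        // 0..2π
  const sectorSize = (2 * Math.PI) / commands.length;
  const sector = Math.floor(angle / sectorSize) % commands.length;
  return commands[sector];
}
```

Because the mapping from direction to command never changes, repeated use builds exactly the muscle memory described above.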

Radial menus have been slower to come to the web than those system and app environments, even though this interaction is well suited to the web, and to browser capabilities.

It's still only pictures under glass.

Embrace the card metaphor

One strategy is to embrace the card metaphor. Cards, tiles, panels are the core metaphor of the major operating systems and most successful apps.

Twitter, Clear. Swiping, flipping cards.

Turns out if you swap “cards” for “pages,” that’s the web’s metaphor, too.

Like cards, pages also suggest a direct physical metaphor, but even the first gen of touchscreen browsers made you use a button to browse pages, confusing the metaphor.

In 2013, touchscreen browsers began to do the right thing, when Google Chrome, Mobile Safari, and others added gestural browsing.

After decades of using a physical metaphor—the page—to describe the web, we finally got a physical interaction to navigate it.

What else is possible with that metaphor? What are all the actions you can take on a paper card? What’s the DATA METAPHOR?

Flip through. Stretch. Turn over. Sort. Fold in half. Crumple. Deal. Dog-ear. Shuffle. Toss it aside. Stack. Bend.

Facebook Paper

Facebook reimagined its app with an experiment called Paper. They basically went all in on the card metaphor. They asked themselves, what can you do to a card?

Flip, fold, shuffle, stack, turn over, stretch, sort, crumple, dog-ear, toss it aside.

Now what do all of those things mean in a digital context? Trouble is, Paper over-did it. A great thought project but when you do all of these things at once, it’s overload. Maybe it was premature. Maybe they would’ve done better to introduce a few of these at a time.

Be gentle to people as you introduce new interactions.

That hard lesson doesn’t take away from the general principle, though. Let the real world be your guide. Borrow familiar symbols or notation

Adobe Proto was an iPad app that Adobe made for designers. It's since been replaced by Adobe Comp, a way to quickly comp UI.

Showing Proto, though, for its clever use of notation.

In this case, for drawing wireframes. The first phase of that is literally sketching. All gesture based. Check this out.

This takes a familiar physical model, sketching, and applies it as-is to a new format. And it's cool.

It doesn’t have to be a direct visual borrow, though. Honor PHYSICAL CONSTRAINTS

If things we can do in the physical world can be inspirations for gestural interactions, so too can things we can’t.

Always be ready to question a traditional design pattern from the desktop era if it doesn’t square with direct physical interaction. As we embrace these new interfaces, also have to give up old abstractions and mental models that we associate with desktop controls.

Say that you have a simple drawing app.

What if you want a smaller brush? Traditionally you’d have a slider or some brush selector. But the thing is, you have a brush, and it doesn’t change size.

A setting to change my finger’s touch footprint to double or half actual size would just be confusing, hard to get my head around.

Instead of changing the brush size, you change the canvas size. Pinch to zoom in or out. And then draw your next stroke. Finger always keeps its same physical size on the screen. It’s the canvas that changes size.
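Here's roughly what that could look like with Pointer Events: pinching rescales the canvas while the brush is left alone. This is a sketch under my own assumptions (scaling via a CSS transform, no panning or clamping), not the drawing app's actual code.

```ts
// Sketch: pinch changes the canvas scale, not the brush size.
// The finger keeps its physical footprint; the drawing zooms instead.
// Names and the CSS-transform approach are illustrative assumptions.
function attachPinchZoom(canvas: HTMLElement): void {
  const pointers = new Map<number, { x: number; y: number }>();
  let scale = 1;
  let startDistance = 0;
  let startScale = 1;

  const distance = (): number => {
    const [a, b] = [...pointers.values()];
    return Math.hypot(a.x - b.x, a.y - b.y);
  };

  canvas.addEventListener('pointerdown', (e) => {
    pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
    if (pointers.size === 2) {
      startDistance = distance();
      startScale = scale;
    }
  });

  canvas.addEventListener('pointermove', (e) => {
    if (!pointers.has(e.pointerId)) return;
    pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
    if (pointers.size === 2 && startDistance > 0) {
      // Spread fingers apart to zoom in; pinch together to zoom out.
      scale = startScale * (distance() / startDistance);
      canvas.style.transform = `scale(${scale})`;
    }
  });

  const release = (e: PointerEvent) => pointers.delete(e.pointerId);
  canvas.addEventListener('pointerup', release);
  canvas.addEventListener('pointercancel', release);
}
```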

[slow] When you deal w/touch, have to rethink these familiar abstractions.

Not all gestures are such direct interactions, though. Gestures are the keyboard shortcuts of touch

Some gestures are the keyboard shortcuts of touch. Often supplements to old-school buttons.

What makes them shortcuts? The timesaving aspect is that you can just slap at the whole screen.

But these can be difficult to find. The FIVE-FINGER GRAB, the FOUR-FINGER DIPPITY DO, the DOUBLE-DIGIT TWIST. Shortcut! Swipe up from the Me button for direct messages.

So you have to make the thing discoverable.

[Example from moment when Twitter intro’d gestures but didn’t explain them]

The need for this kind of help again points up the fact that we don’t yet have gesture conventions. But you can be more subtle, too, and animation can be useful coaching.

USA Today animation, way back in 2008, at the start of the App Store. HINT your physical interactions.

Look at everything like it's new.

Thanks for your BRAINS. @bigmediumjosh abookapart.com