“Buttons Were an Inspired UI Hack, but Now We’ve Got Better Options”
The good folks at O'Reilly interviewed me this week about how new technologies are changing the way we should think about interface design, as both consumers and designers. It's a long interview, but here's a quick excerpt:
It's a really exciting time for interaction design because so many new technologies are becoming mature and affordable. Touch got there a few years ago. Speech is just now arriving. Computer vision, with face recognition and gesture recognition like Kinect, is coming along. So, we have all these areas where computers are learning to understand our particularly human forms of communication.
In the past, we had to learn to act and think like the machine. At the command line, we had to write in the computer's language, not our own. The desktop graphical user interface was a big step forward in making things more humane through visuals, but it was still oriented around how computers saw the world, not humans. When you consider the additions of touch, speech, facial expression, and physical gesture, you have nearly the whole range of human (and humane) communication tools. As computers learn the subtleties of those expressions, our interfaces can become more human and more intuitive, too.
Touchscreens are leading this charge for now, but touch isn't appropriate in every context. Speech is obviously great for the car, for walking, for any context where you need your eyes elsewhere. We're going to see interfaces that use these different modes of communication in context-appropriate combinations. But that means we have to start thinking hard about how our content works in all these different contexts. So many are struggling just to figure out how to make the content adapt to a smaller screen. How about how your content sounds when spoken? How about when it can be touched, or how it should respond to physical gestures or facial expressions? There's lots of work ahead.
And hey, if this stuff interests you and you happen to be in Austin for SXSW this week, I'm giving a talk about the future of touch-based interfaces on Friday. In particular, I'm focusing on how you can use gestures to create experiences that are more fun, more intuitive, and more efficient. At least as important, the talk will explore how to make those gestures easy to discover:
Teaching Touch: Tapworthy Touchscreen Design
Friday, March 9, 3:30–4:30pm
Ballroom A, Austin Convention Center
Discover the rules of thumb for finger-friendly design. Touch gestures are sweeping away buttons, menus and windows from mobile devices—and even from the next version of Windows. Find out why those familiar desktop widgets are weak replacements for manipulating content directly, and learn to craft touchscreen interfaces that effortlessly teach users new gesture vocabularies.
The challenge: gestures are invisible, without the visual cues offered by buttons and menus. As your touchscreen app sheds buttons, how do people figure out how to use the damn thing? Learn to lead your audience by the hand (and fingers) with practical techniques that make invisible gestures obvious. Designer Josh Clark (author of O'Reilly books "Tapworthy" and "Best iPhone Apps") mines a variety of surprising sources for interface inspiration and design patterns. Along the way, discover the subtle power of animation, why you should be playing lots more video games, and why a toddler is your best beta tester.
See you there!