Unifone

February 13, 2013

I just read Unifone by David Holman et al. in the proceedings of TEI '13 and found it well presented and well structured.

These are my notes:

objective

one-handed mobile interaction: squeezing the phone with the hand that is holding it

how can you leverage squeezing as secondary input? (e.g. for a context menu)

related work

Karlson et al. 2008: users adopt (and prefer) one-handed strategies whenever possible
Y. Guiard 1987: the kinematic chain as a model
Taylor et al. 2008: proposed Graspables, which senses the structure of the grasp and infers the user's most likely activity

Holman writes on page 2: “Graspables senses the structure of the grasp and infers the user’s most likely activity. Unifone, on the other hand, uses individual finger input instead of grasp-based sensing.”

without having read about Graspables, I wonder what describes a grasp other than what Unifone is tracking

Rekimoto 1996 (Tilting Operations for Small Screen Interfaces): they also navigated menus, browsed large maps, and selected targets in pie menus

Hinckley and Baudisch also have some work about offscreen interaction with mobile devices

Guiard’s kinematic chain

the non-dominant (left) hand provides a frame of reference within which the dominant (right) hand performs actions

auxiliary touch gestures

squeezing the top
squeezing the middle
squeezing the bottom

coarse input

you are not squeezing a particular spot but rather a whole area
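
to make the coarse-zone idea concrete for myself, here is a minimal sketch, assuming the device could give some normalized estimate of where along the edge the squeeze happens (the paper does not describe such an API; the function and the thresholds are mine):

    # sketch only: "position" is an assumed normalized estimate of where the
    # squeeze happens along the edge (0.0 = bottom, 1.0 = top); the paper
    # does not expose such a value, and the zone thresholds are my own
    def squeeze_zone(position: float) -> str:
        if position >= 2 / 3:
            return "top"
        if position >= 1 / 3:
            return "middle"
        return "bottom"

    print(squeeze_zone(0.9))  # top
    print(squeeze_zone(0.5))  # middle
    print(squeeze_zone(0.1))  # bottom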

tasks & results

tasks that required displacing the thumb performed better with Unifone.

direct scrolling

28% slower with Unifone
squeeze top to scroll up
squeeze bottom to scroll down

halted scrolling

17% slower with Unifone
scroll through large lists
squeeze top to slow down
squeeze bottom to stop

map navigation

12.5% faster with Unifone
squeeze top right for zooming

application switching

9.8% faster with Unifone
squeeze to bring up multitask menu
swipe to navigate
let go to select
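
the way I picture this squeeze–swipe–release flow, the menu only exists while the squeeze is held; a minimal sketch of that reading, where the pressure threshold, method names, and app list are all my assumptions rather than anything from the paper:

    # sketch of squeeze/swipe/release as a held-down (quasimodal) menu;
    # threshold, names, and app list are assumptions, not the paper's code
    SQUEEZE_THRESHOLD = 0.6  # assumed normalized pressure level

    class AppSwitcher:
        def __init__(self, apps):
            self.apps = apps
            self.menu_open = False
            self.index = 0

        def on_pressure(self, pressure):
            squeezing = pressure >= SQUEEZE_THRESHOLD
            if squeezing and not self.menu_open:
                self.menu_open = True            # squeeze: menu appears
            elif not squeezing and self.menu_open:
                self.menu_open = False           # let go: menu closes ...
                return self.apps[self.index]     # ... and the selection is made
            return None

        def on_swipe(self, delta):
            if self.menu_open:                   # swipes only count while squeezing
                self.index = (self.index + delta) % len(self.apps)

    switcher = AppSwitcher(["mail", "maps", "browser"])
    switcher.on_pressure(0.8)          # squeeze to bring up the menu
    switcher.on_swipe(1)               # swipe to navigate
    print(switcher.on_pressure(0.1))   # let go to select -> "maps"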

formatting with context menu

25% faster with Unifone
select with thumb
squeeze to reveal context menu

device

has two (!) pressure sensors
two metal bars distribute the pressure beyond the actuated point
Holman on page 4: “Without this force-distributed design, it is difficult to achieve ergonomic comfort across a range of users.”

this is just an assumption, right?

experimental design

10 right-handed participants (so that only one device was needed)
5 tasks, each performed 5 times per condition

two independent variables

input method: thumb only or thumb + Unifone
target distance

dependent measure: completion time

results

auxiliary input is supportive

“the remaining fingers should act as secondary controls that extend the thumb’s behavior” (page 7)

auxiliary input is coarse

“Given the limited range of motion of the auxiliary fingers, designing for coarse interaction is preferred and supporting input zones—instead of exact target locations—is critical” (page 7)

auxiliary input is quasimodal

“Thus, the complexity of modal operations can be avoided by treating pressure as a simple quasimodal input state. If multiple levels are used, as it is in the halted scrolling task, it should be mapped across a continuum of the same dimension.” (page 8)
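
the way I read "mapped across a continuum of the same dimension" for something like halted scrolling: light pressure damps the scroll, full pressure stops it, so slowing down and stopping are just two points on one pressure axis. A toy sketch of that reading (the mapping and the constants are mine, not the paper's):

    # toy reading of "multiple levels on a continuum of the same dimension":
    # scroll speed is damped continuously by squeeze pressure; the formula
    # and numbers are my own interpretation, not from the paper
    def damped_scroll_speed(base_speed: float, pressure: float) -> float:
        pressure = max(0.0, min(1.0, pressure))  # clamp to [0, 1]
        return base_speed * (1.0 - pressure)     # full squeeze = full stop

    print(damped_scroll_speed(1000.0, 0.0))  # 1000.0, free scrolling
    print(damped_scroll_speed(1000.0, 0.5))  # 500.0, slowed down
    print(damped_scroll_speed(1000.0, 1.0))  # 0.0, halted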

couple auxiliary input

“Therefore, quasimodal gestures should be brief, should pay close attention to their relationship with the thumb, and be carefully designed to conform to the user’s mental model and expectation of physical motion as they grasp and manipulate a device” (page 8)

reference

Holman, D., Banerjee, A., Hollatz, A., and Vertegaal, R. Unifone: Designing for Auxiliary Finger Input in One-Handed Mobile Interactions. In Proceedings of TEI 2013.
