Publication
Graspable User Interfaces
Abstract
This dissertation defines and explores Graspable User Interfaces, an evolution of the input mechanisms used in graphical user interfaces (GUIs). A Graspable UI design gives users concurrent access to multiple, specialized input devices that can serve as dedicated physical interface widgets, affording physical manipulation and spatial arrangement. As in conventional GUIs, these physical devices function as “handles” or manual controllers for logical functions on widgets in the interface. However, the Graspable UI builds on current practice in a number of ways. A conventional GUI typically has only one graphical input device, such as a mouse, so the physical handle is necessarily “time-multiplexed,” being repeatedly attached to and detached from the various logical functions of the GUI. A significant aspect of the Graspable UI is that there can be more than one input device; input control can then be “space-multiplexed.” That is, different devices can be attached to different functions, each independently (and possibly simultaneously) accessible. This affords the opportunity to exploit the shape, size, and position of each physical controller to increase functionality and decrease complexity, and it allows the attachment of a device to a function to persist for longer. By using physical objects, we not only allow users to employ a larger expressive range of gestures and grasping behaviors, but also let them leverage their innate spatial reasoning skills and everyday knowledge of object manipulation.
In this thesis the concept of Graspable user interfaces is defined, support for the concept is provided from the psychological literature, and instantiations of the concept are identified in existing user interfaces. A task analysis of an existing interface’s input activities, and of how to convert these to Graspable user interface devices, is presented. The possible uses and implementation difficulties of bricks, a specific Graspable UI design, are investigated. Finally, the advantages of two of the Graspable UI properties over conventional time-multiplexed generic input devices are measured in two controlled experiments.
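To make the distinction between the two multiplexing styles concrete, here is a minimal sketch in Python, using hypothetical names (LogicalFunction, InputDevice, bricks A and B) that are illustrative rather than drawn from the dissertation: a single time-multiplexed mouse must be serially rebound to each logical function, whereas space-multiplexed physical handles each remain attached to their own function.

```python
# Minimal sketch (illustrative names, not from the dissertation) contrasting
# time-multiplexed and space-multiplexed attachment of input devices.

class LogicalFunction:
    """A widget-level operation in the interface, e.g. scrolling or rotating."""
    def __init__(self, name):
        self.name = name

    def apply(self, delta):
        print(f"{self.name} adjusted by {delta}")


class InputDevice:
    """A physical handle that can be attached to a logical function."""
    def __init__(self, label):
        self.label = label
        self.attached = None

    def attach(self, function):
        self.attached = function

    def move(self, delta):
        if self.attached is not None:
            self.attached.apply(delta)


scroll = LogicalFunction("scroll")
rotate = LogicalFunction("rotate")

# Time-multiplexed: a single mouse is serially re-attached to each function,
# so only one function is manipulable at any moment.
mouse = InputDevice("mouse")
mouse.attach(scroll)
mouse.move(+10)
mouse.attach(rotate)   # the scroll handle is given up to acquire rotation
mouse.move(-5)

# Space-multiplexed: each function keeps its own dedicated physical handle,
# so both stay persistently attached and could be operated simultaneously.
brick_a = InputDevice("brick A")
brick_b = InputDevice("brick B")
brick_a.attach(scroll)
brick_b.attach(rotate)
brick_a.move(+10)
brick_b.move(-5)
```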
Download publication
Related Resources
2010
MouseLight: Bimanual interactions on digital paper using a pen and a spatially-aware mobile projector
MouseLight is a standalone mobile projector with the form factor of a…
2011
From Games and Films to Molecular Simulation and Design
This paper describes our experience at Autodesk Research in…
2019
Physics-based simulation ontology: an ontology to support modelling and reuse of data for physics-based simulation
The current work presents an ontology developed for physics-based…
2011
Sensor-enabled Cubicles for Occupant-centric Capture of Building Performance Data
Building performance discourse has traditionally focused on the…