Publication
Typing on Glasses: Adapting Text Entry to Smart Eyewear
Abstract
Text entry for smart eyewear is generally limited to speech-based input due to constraints of the input channels. However, many smart eyewear devices now include a side touchpad, making gesture-based text entry feasible. The Swipeboard technique, recently proposed for ultra-small touch screens such as smartwatches, may be particularly suitable for smart eyewear: unlike other recent text-entry techniques for small devices, it supports eyes-free input. We investigate the feasibility and limitations of implementing Swipeboard on smart eyewear, using the side touchpad for input. Our first study reveals usability and recognition problems with performing the required gestures on the side touchpad. To address these problems, we propose SwipeZone, which replaces diagonal gestures with zone-specific swipes. In a text entry study, we show that our redesign achieved an entry rate of 8.73 WPM, 15.2% higher than Swipeboard, with a statistically significant improvement in the second half of the study blocks.
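The core idea behind SwipeZone is to trade Swipeboard's diagonal strokes, which are hard to perform reliably on a narrow side touchpad, for strokes distinguished by which zone of the touchpad they start in. The sketch below is purely illustrative and is not the authors' implementation; the zone count, the normalized coordinate convention, and the minimum stroke length are all assumptions made for the example.

```python
# Illustrative sketch (not the paper's implementation): classifying a stroke
# on a side touchpad into a zone-specific swipe, the idea behind SwipeZone.
# Zone count, thresholds, and coordinate conventions are assumptions.

from dataclasses import dataclass


@dataclass
class Swipe:
    x0: float  # touch-down x, normalized to [0, 1] along the touchpad
    y0: float  # touch-down y, normalized to [0, 1], increasing downward
    x1: float  # lift-off x
    y1: float  # lift-off y


def classify_zone_swipe(swipe: Swipe, n_zones: int = 3, min_len: float = 0.15) -> str:
    """Map a stroke to (direction, zone); strokes shorter than min_len count as taps."""
    dx, dy = swipe.x1 - swipe.x0, swipe.y1 - swipe.y0
    # Split the touchpad into n_zones bands along its length; the band
    # containing the touch-down point names the zone.
    zone = min(int(swipe.x0 * n_zones), n_zones - 1)
    if (dx * dx + dy * dy) ** 0.5 < min_len:
        return f"tap@zone{zone}"
    # No diagonals: the dominant axis decides the direction.
    if abs(dx) >= abs(dy):
        direction = "forward" if dx > 0 else "backward"
    else:
        direction = "up" if dy < 0 else "down"
    return f"{direction}@zone{zone}"


# Example: a forward stroke that begins in the rearmost band of the touchpad.
print(classify_zone_swipe(Swipe(0.1, 0.5, 0.6, 0.55)))  # forward@zone0
```

Restricting recognition to the dominant axis is what makes zone-specific swipes attractive on a long, narrow touchpad: the classifier never has to distinguish diagonal angles, only which band a stroke started in and its primary direction.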