Waken
Reverse Engineering Usage Information and Interface Structure from Software Videos
Abstract
Nikola Banovic, Tovi Grossman, Justin Matejka, George Fitzmaurice
ACM Symposium on User Interface Software & Technology 2012
We explore the possibilities and opportunities related to reverse engineering usage information from screen-captured application video tutorials. We develop an application-independent system that recognizes UI components and activities, such as cursor movements and icon clicks, from an input set of video tutorials. We then present Waken, an enhanced video player, which showcases some of the design opportunities that are introduced by having this additional meta-data. In particular, users can directly interact with UI components in the video, such as icons and menus, to display associated information, or navigate to relevant moments in other videos. Initial results suggest that the system can successfully reconstruct many aspects of a UI without any prior application-dependent knowledge.
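The abstract does not describe the recognition pipeline itself, but one plausible building block for recognizing an icon in a captured tutorial frame is standard template matching. The sketch below uses OpenCV's `cv2.matchTemplate` with hypothetical file names and an arbitrary confidence threshold; it illustrates the general idea only and is not the authors' actual method.

```python
# Illustrative sketch only: locate a known icon template in a single
# screen-capture frame via OpenCV template matching. File names and the
# 0.9 threshold are hypothetical, not taken from the Waken system.
import cv2

frame = cv2.imread("frame_0042.png")        # one frame from a tutorial video
template = cv2.imread("icon_template.png")  # cropped image of the icon to find

# Normalized cross-correlation yields a similarity score at every position.
scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_loc = cv2.minMaxLoc(scores)

if best_score > 0.9:  # confident match
    h, w = template.shape[:2]
    print(f"Icon found at {best_loc}, size {w}x{h}, score {best_score:.2f}")
else:
    print("Icon not found in this frame")
```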
Associated Researchers
Nikola Banovic
Dept. of Computer Science, University of Toronto, Ontario, Canada