Results 1 - 10 of 146
Toolkit Design for Interactive Structured Graphics
IEEE Transactions on Software Engineering, 2004
"... In this paper, we analyze toolkit designs for building graphical applications with rich user interfaces, comparing polylithic and monolithic toolkit-based solutions. Polylithic toolkits encourage extension by composition and follow a design philosophy similar to 3D scene graphs supported by toolkits ..."
Abstract
-
Cited by 215 (9 self)
In this paper, we analyze toolkit designs for building graphical applications with rich user interfaces, comparing polylithic and monolithic toolkit-based solutions. Polylithic toolkits encourage extension by composition and follow a design philosophy similar to 3D scene graphs supported by toolkits including Java3D and OpenInventor. Monolithic toolkits, on the other hand, encourage extension by inheritance, and are more akin to 2D Graphical User Interface toolkits such as Swing or MFC. We describe Jazz (a polylithic toolkit) and Piccolo (a monolithic toolkit), each of which we built to support interactive 2D structured graphics applications in general, and Zoomable User Interface applications in particular. We examine the trade-offs of each approach in terms of performance, memory requirements, and programmability. We conclude that a polylithic approach is most suitable for toolkit builders, for visual design software where code is automatically generated, and for application builders where there is much customization of the toolkit.
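As a rough, hedged illustration of the contrast drawn above, the Java sketch below (hypothetical classes, not the actual Jazz or Piccolo API) adds dragging behaviour to a text node once by inheritance (the monolithic style) and once by composing it with a wrapper node (the polylithic style).

// Hypothetical sketch only; illustrates extension by inheritance vs. composition.
abstract class Node {
    abstract void paint(java.awt.Graphics2D g);
}

// Monolithic style: behaviour is added by subclassing, as in Swing-like toolkits.
class DraggableTextNode extends Node {
    String text;
    double x, y;
    DraggableTextNode(String text) { this.text = text; }
    @Override void paint(java.awt.Graphics2D g) { g.drawString(text, (float) x, (float) y); }
    void drag(double dx, double dy) { x += dx; y += dy; }
}

// Polylithic style: behaviour is added by wrapping another node, as in 3D scene graphs.
class TextNode extends Node {
    String text;
    TextNode(String text) { this.text = text; }
    @Override void paint(java.awt.Graphics2D g) { g.drawString(text, 0, 0); }
}

class DragDecorator extends Node {
    Node child;
    double x, y;
    DragDecorator(Node child) { this.child = child; }
    @Override void paint(java.awt.Graphics2D g) {
        java.awt.geom.AffineTransform saved = g.getTransform();
        g.translate(x, y);                  // offset everything beneath this node
        child.paint(g);
        g.setTransform(saved);
    }
    void drag(double dx, double dy) { x += dx; y += dy; }
}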
Gestures without Libraries, Toolkits or Training: A $1 Recognizer for User Interface Prototypes
"... Although mobile, tablet, large display, and tabletop computers increasingly present opportunities for using pen, finger, and wand gestures in user interfaces, implementing gesture recognition largely has been the privilege of pattern matching experts, not user interface prototypers. Although some us ..."
Abstract
-
Cited by 206 (16 self)
Although mobile, tablet, large display, and tabletop computers increasingly present opportunities for using pen, finger, and wand gestures in user interfaces, implementing gesture recognition largely has been the privilege of pattern matching experts, not user interface prototypers. Although some user interface libraries and toolkits offer gesture recognizers, such infrastructure is often unavailable in design-oriented environments like Flash, scripting environments like JavaScript, or brand new off-desktop prototyping environments. To enable novice programmers to incorporate gestures into their UI prototypes, we present a “$1 recognizer” that is easy, cheap, and usable almost anywhere in about 100 lines of code. In a study comparing our $1 recognizer, Dynamic Time Warping, and the Rubine classifier on user-supplied gestures, we found that $1 obtains over 97% accuracy with only 1 loaded template and 99% accuracy with 3+ loaded templates. These results were nearly identical to DTW and superior to Rubine. In addition, we found that medium-speed gestures, in which users balanced speed and accuracy, were recognized better than slow or fast gestures for all three recognizers. We also discuss the effect that the number of templates or training examples has on recognition, the score falloff along recognizers’ N-best lists, and results for individual gestures. We include detailed pseudocode of the $1 recognizer to aid development, inspection, extension, and testing. ACM Categories & Subject Descriptors: H5.2. [Information interfaces and presentation]: User interfaces – Input devices and strategies. I5.2. [Pattern recognition]: Design methodology – Classifier design and evaluation. I5.5. [Pattern recognition]: Implementation – Interactive systems.
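The detailed pseudocode in the paper is the authoritative reference; the Java sketch below only paraphrases the final template-matching step, assuming the candidate and all templates have already been resampled to the same number of points and rotated, scaled, and translated into a common frame (class and method names here are hypothetical).

import java.awt.geom.Point2D;
import java.util.List;

class DollarOneSketch {
    // Average point-to-point distance between two gestures that have already
    // been normalized to the same number of points and a common frame.
    static double pathDistance(List<Point2D> a, List<Point2D> b) {
        double d = 0;
        for (int i = 0; i < a.size(); i++) d += a.get(i).distance(b.get(i));
        return d / a.size();
    }

    // Return the index of the closest loaded template, or -1 if none are loaded.
    static int recognize(List<Point2D> candidate, List<List<Point2D>> templates) {
        int best = -1;
        double bestDistance = Double.MAX_VALUE;
        for (int t = 0; t < templates.size(); t++) {
            double d = pathDistance(candidate, templates.get(t));
            if (d < bestDistance) { bestDistance = d; best = t; }
        }
        return best;
    }
}

The full recognizer additionally searches a small range of candidate rotations (via golden section search) for the best angular alignment before taking this distance, which is what keeps the comparison rotation invariant.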
Jazz: An Extensible Zoomable User Interface Graphics Toolkit in Java
2000
"... In this paper we investigate the use of scene graphs as a general approach for implementing two-dimensional (2D) graphical applications, and in particular Zoomable User Interfaces (ZUIs). Scene graphs are typically found in three-dimensional (3D) graphics packages such as Sun's Java3D and SGI&a ..."
Abstract
-
Cited by 183 (39 self)
In this paper we investigate the use of scene graphs as a general approach for implementing two-dimensional (2D) graphical applications, and in particular Zoomable User Interfaces (ZUIs). Scene graphs are typically found in three-dimensional (3D) graphics packages such as Sun's Java3D and SGI's OpenInventor. They have not been widely adopted by 2D graphical user interface toolkits. To explore the effectiveness of scene graph techniques, we have developed Jazz, a general-purpose 2D scene graph toolkit. Jazz is implemented in Java using Java2D, and runs on all platforms that support Java 2. This paper describes Jazz and the lessons we learned using Jazz for ZUIs. It also discusses how 2D scene graphs can be applied to other application areas. Keywords: Zoomable User Interfaces (ZUIs), Animation, Graphics, User Interface Management Systems (UIMS), Pad++, Jazz.
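As a minimal sketch of the scene-graph approach investigated here (hypothetical classes, not Jazz's real API), each node carries its own affine transform and paints its children recursively, which reduces zooming to a scale applied at a camera node.

import java.awt.Graphics2D;
import java.awt.geom.AffineTransform;
import java.util.ArrayList;
import java.util.List;

abstract class SceneNode {
    final AffineTransform transform = new AffineTransform();
    final List<SceneNode> children = new ArrayList<>();

    void paint(Graphics2D g) {
        AffineTransform saved = g.getTransform();
        g.transform(transform);              // compose this node's transform
        paintNode(g);                        // draw this node's own geometry
        for (SceneNode c : children) c.paint(g);
        g.setTransform(saved);               // restore for siblings
    }

    void paintNode(Graphics2D g) {}          // leaves override this to draw
}

class Camera extends SceneNode {
    // Zooming the whole scene is just a scale about a point on the camera node.
    void zoom(double factor, double cx, double cy) {
        transform.translate(cx, cy);
        transform.scale(factor, factor);
        transform.translate(-cx, -cy);
    }
}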
ConnecTables: Dynamic Coupling of Displays for the Flexible Creation of Shared Workspaces
2001
"... We present the ConnecTable, a new mobile, networked and context-aware information appliance that provides affordances for pen-based individual and cooperative work as well as for the seamless transition between the two. In order to dynamically enlarge an interaction area for the purpose of shared us ..."
Abstract
-
Cited by 93 (4 self)
We present the ConnecTable, a new mobile, networked and context-aware information appliance that provides affordances for pen-based individual and cooperative work as well as for the seamless transition between the two. In order to dynamically enlarge an interaction area for the purpose of shared use, a flexible coupling of displays has been realized that overcomes the restrictions of display sizes and borders. Two ConnecTable displays dynamically form a homogeneous display area when moved close to each other. The appropriate triggering signal comes from built-in sensors, allowing users to temporarily combine their individual displays into a larger shared one by a simple physical movement in space. Connected ConnecTables allow their users to work in parallel on an ad hoc created shared workspace as well as to exchange information by simply shuffling objects from one display to the other. We discuss the user interface and related issues as well as the software architecture. We also present the physical realization of the ConnecTables.
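The Java fragment below is a hypothetical sketch of the coupling behaviour only, not the ConnecTable software architecture: a proximity event from the built-in sensors merges two displays' workspaces into one shared area, and a separation event gives a display its own workspace back.

import java.util.HashSet;
import java.util.Set;

class Workspace {
    final Set<Display> members = new HashSet<>();
}

class Display {
    final String id;
    Workspace workspace = new Workspace();
    Display(String id) { this.id = id; workspace.members.add(this); }
}

class CouplingManager {
    // Sensors report two displays moved edge to edge: merge their workspaces.
    void onProximity(Display a, Display b) {
        if (a.workspace == b.workspace) return;      // already coupled
        Workspace merged = a.workspace;
        merged.members.addAll(b.workspace.members);
        for (Display d : b.workspace.members) d.workspace = merged;
    }

    // A display is moved away again: it returns to a private workspace.
    void onSeparation(Display d) {
        d.workspace.members.remove(d);
        d.workspace = new Workspace();
        d.workspace.members.add(d);
    }
}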
Past, Present and Future of User Interface Software Tools
ACM Transactions on Computer-Human Interaction, 2000
"... A user interface software tool helps developers design and implement the user interface. Research on past tools has had enormous impact on today's developers---virtually all applications today were built using some form of user interface tool. In this paper, we consider cases of both success an ..."
Abstract
-
Cited by 86 (3 self)
A user interface software tool helps developers design and implement the user interface. Research on past tools has had enormous impact on today's developers: virtually all applications today were built using some form of user interface tool. In this paper, we consider cases of both success and failure in past user interface tools. From these cases we extract a set of themes which can serve as lessons for future work. Using these themes, past tools can be characterized by what aspects of the user interface they addressed, their threshold and ceiling, what path of least resistance they offer, how predictable they are to use, and whether they addressed a target that became irrelevant. We believe the lessons of these past themes are particularly important now, because increasingly rapid technological changes are likely to significantly change user interfaces. We are at the dawn of an era where user interfaces are about to break out of the "desktop" box where they have been stuck for the ...
User Interface Declarative Models and Development Environments: A Survey
Proceedings of DSV-IS2000, volume 1946 of LNCS, 2000
"... presentation model APM Provides a conceptual description of the structure and behaviour of the visual parts of the user interface. There the UI is described in terms abstract objects. ..."
Abstract
-
Cited by 78 (3 self)
Presentation model (APM): provides a conceptual description of the structure and behaviour of the visual parts of the user interface. There, the UI is described in terms of abstract objects.
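Purely as an illustrative sketch (assuming nothing about the particular systems surveyed), such a presentation model can be thought of as a tree of abstract objects that a separate renderer later maps onto concrete widgets, for example Swing components.

import java.util.List;

// A UI described declaratively in terms of abstract objects, not widgets.
record AbstractObject(String kind, String label, List<AbstractObject> children) {}

class AbstractPresentationModelSketch {
    // Example: a login dialog as structure and intent, with no layout or toolkit detail.
    static AbstractObject loginDialog() {
        return new AbstractObject("container", "Login", List.of(
                new AbstractObject("textInput",   "User name", List.of()),
                new AbstractObject("secretInput", "Password",  List.of()),
                new AbstractObject("action",      "Sign in",   List.of())));
    }
}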
Inconsistency Management for Multiple-View Software Development Environments
IEEE Transactions on Software Engineering, 1998
"... Abstract—Developers need tool support to help manage the wide range of inconsistencies that occur during software development. Such tools need to provide developers with ways to define, detect, record, present, interact with, monitor and resolve complex inconsistencies between different views of sof ..."
Abstract
-
Cited by 65 (12 self)
Developers need tool support to help manage the wide range of inconsistencies that occur during software development. Such tools need to provide developers with ways to define, detect, record, present, interact with, monitor and resolve complex inconsistencies between different views of software artifacts, different developers and different phases of software development. This paper describes our experience with building complex multiple-view software development tools that support diverse inconsistency management facilities. We describe software architectures we have developed, user interface techniques used in our multiple-view development tools, and discuss the effectiveness of our approaches compared to other architectural and HCI techniques. Index Terms: Inconsistency management, multiple views, integrated software development environments, collaborative software development.
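A minimal sketch of the general idea, with hypothetical names rather than the paper's actual architecture: inconsistency rules are defined over a set of views, and violations are detected and recorded for later resolution rather than simply forbidden.

import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// One view (e.g. a design diagram or a source file) of a shared artifact.
class View {
    final String name;
    final List<String> classNames = new ArrayList<>();
    View(String name) { this.name = name; }
}

// A rule holds when the views disagree, e.g. a class named in the design
// view that does not appear in the code view.
class InconsistencyRule {
    final String description;
    final Predicate<List<View>> violated;
    InconsistencyRule(String description, Predicate<List<View>> violated) {
        this.description = description;
        this.violated = violated;
    }
}

class InconsistencyManager {
    // Detect and record violations instead of blocking the developer.
    List<String> check(List<View> views, List<InconsistencyRule> rules) {
        List<String> log = new ArrayList<>();
        for (InconsistencyRule r : rules)
            if (r.violated.test(views)) log.add("Inconsistency: " + r.description);
        return log;
    }
}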
Implications For a Gesture Design Tool
In Human Factors in Computing Systems (SIGCHI Proceedings), ACM, 1999
"... Interest in pen-based user interfaces is growing rapidly. One potentially useful feature of pen-based user interfaces is gestures, that is, a mark or stroke that causes a command to execute. Unfortunately, it is difficult to design gestures that are easy 1) for computers to recognize and 2) for huma ..."
Abstract
-
Cited by 60 (5 self)
Interest in pen-based user interfaces is growing rapidly. One potentially useful feature of pen-based user interfaces is gestures, that is, a mark or stroke that causes a command to execute. Unfortunately, it is difficult to design gestures that are easy 1) for computers to recognize and 2) for humans to learn and remember. To investigate these problems, we built a prototype tool for designing gesture sets. An experiment was then performed to gain insight into the gesture design process and to evaluate the tool. The experiment confirmed that gesture design is very difficult and suggested several ways in which current tools can be improved. The most important of these improvements is to make the tools more active and provide more guidance for designers. This paper describes the gesture design tool, the experiment, and its results. Keywords: pen-based user interface, PDA, user study, gesture, UI design.