The Wearable Computer Lab conducts world-class research in a variety of related HCI fields, including augmented reality, wearable computing, information visualisation, and human-computer interfaces. This page gives an overview of our work; for more information on a project, please visit its dedicated project page.
Spatial Augmented Reality
Spatial Augmented Reality (SAR) is a branch of AR research utilising digital projectors as the display technology. The lab conducts research into novel interaction techniques and systems for SAR, the presentation of information, and advanced rendering techniques.
- Interactive SAR
- Prototyping with SAR
- Advanced Rendering for SAR
- Projector-Based Augmented Reality for In-Situ Support for the Automotive Industry
- Procedural Augmented Reality Systems in the Fabrication Facilities
- Virtual Worlds Impacting Real Worlds
Information Visualisation
Information visualisation, the process of effectively communicating data visually, combines the disciplines of human perception theory, psychology, and computer science. The WCL actively works in the information visualisation space, exploring both traditional displays and virtual world mediums.
Mobile and Outdoor Augmented Reality
For several years, the Wearable Computer Lab has pioneered outdoor AR research. Our hardware platform, the Tinmith wearable computer, features powerful graphics capabilities, integrated GPS and orientation sensors, and a head-worn display. Using this system, we conduct research into mobile and outdoor AR interaction techniques.
- In-situ Modelling in Outdoor AR
- Outdoor AR Menu Systems
- Corrosion Augmented Reality Visualisation
- Augmented viewport: Interaction at a distance
New Input Devices and Hardware
Our research into interaction techniques and HCI issues sometimes requires us to develop new hardware and input devices. These devices allow us to prototype and evaluate interaction techniques that would be impossible to achieve using standard computer peripherals such as the keyboard and mouse.
Active Tangible User Interfaces
In Tangible User Interfaces (TUIs), digital information is mapped to physical artefacts such that direct manipulation of an artefact results in a corresponding manipulation of the digital information. Active Tangible User Interfaces (ATUIs) extend this concept by expanding the functionality of the physical artefacts: the artefacts can physically transform to reflect changes made to the mapped data by external sources. In this project we aim to extend the paradigm even further by introducing concurrent manipulation of a single dataset via distributed active tangible user interfaces.
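The bidirectional mapping behind an active tangible interface can be sketched as a simple observer pattern: manipulating an artefact writes to a shared dataset, and remote changes to that dataset drive each artefact's actuator to a matching state. This is a minimal, hypothetical Python sketch of the concept; the class and method names are illustrative and not the lab's implementation.

```python
import threading


class SharedDataset:
    """A single dataset that several tangible artefacts manipulate concurrently."""

    def __init__(self):
        self._values = {}
        self._lock = threading.Lock()   # serialise concurrent writes
        self._artefacts = []

    def attach(self, artefact):
        self._artefacts.append(artefact)

    def set(self, key, value, source=None):
        with self._lock:
            self._values[key] = value
        # Notify every artefact except the one that made the change,
        # so their physical state transforms to reflect the new data.
        for artefact in self._artefacts:
            if artefact is not source:
                artefact.reflect(key, value)


class ActiveArtefact:
    """A physical handle: manipulating it updates the dataset, and
    external changes move its (simulated) actuator to a new pose."""

    def __init__(self, name, dataset):
        self.name = name
        self.pose = {}
        self.dataset = dataset
        dataset.attach(self)

    def manipulate(self, key, value):
        # The user physically moves the artefact.
        self.pose[key] = value
        self.dataset.set(key, value, source=self)

    def reflect(self, key, value):
        # The actuator transforms the artefact to match the data.
        self.pose[key] = value


data = SharedDataset()
slider_a = ActiveArtefact("slider_A", data)
slider_b = ActiveArtefact("slider_B", data)

slider_a.manipulate("volume", 0.8)   # user adjusts artefact A
print(slider_b.pose["volume"])       # artefact B transforms to match: 0.8
```

Distributing the artefacts across machines would replace the in-process notification with networked messaging, but the core idea is the same: each artefact is both an input device and an output device for the shared data.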
Below is a list of projects that have been worked on in the past. These projects are not under active development; however, they may be continued at some point in the future.