I would like to welcome Dr Denis Kalkofen to the Australian Research Centre for Interactive and Virtual Environments (ARCIVE). He will be visiting us until March 1st.
The Narrative Visualisation project has won a merit award in the Australian Information Industry Association’s (AIIA) national awards for top Research and Development project of 2018!
The Narrative Visualisation project has won the Australian Information Industry Association’s (AIIA) award for top Research and Development project for South Australia in 2018!
A global defence company’s latest collaboration is using the Microsoft HoloLens platform and augmented reality to help civilians understand the role of field hospitals in natural disasters.
The exhibit is on display at UniSA’s hi-tech Museum of Discovery (MOD), where visitors don HoloLens goggles to take a tour of a virtual field hospital and experience real emergency scenarios.
The WCL will be working with Jumbo Vision International on a project titled “Visualization Tools for the Design of Manufactured High End Instrumented Facilities”.
Currently, the design of manufactured high-end instrumented facilities (such as command centres and control panels) takes place almost entirely in the virtual world. The physical space and layout of such systems demand a high level of 3D spatial visualization skill from the stakeholders. Instead of visualizing a command centre with virtual reality tools or expensive physical prototypes, this project will explore white-painted, lightweight wooden objects built to the external dimensions of the centre’s major components, with the details of the workstations projected onto them via large-scale augmented reality.
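Projecting workstation detail onto a flat face of a wooden mock-up comes down to warping rendered imagery into projector space. As a minimal sketch (the corner coordinates below are illustrative, not from the project), a homography can be estimated from four tracked corner correspondences using the standard direct linear transform:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping src -> dst (4 point pairs, DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The null vector of A (last row of V^T) holds the 9 homography entries.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pt):
    """Apply homography H to a 2-D point (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Corners of a workstation face in the renderer's image (pixels) ...
src = [(0, 0), (1024, 0), (1024, 768), (0, 768)]
# ... and where the tracked mock-up's corners land in projector space.
dst = [(210, 155), (830, 170), (845, 610), (200, 590)]

H = homography(src, dst)
```

Once H is known, the rendered image is warped through it each frame, so the detail stays registered to the prototype as it is moved.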
The current decision-making process is time consuming. A major effort is the externalisation of the clients’ needs and requirements. Normal practice requires a large number of design meetings iterating over concepts presented as either engineering drawings or static 3D renderings. Animations with fly-throughs and guided tours allow for a more immersive experience, but clients lack the tools to manipulate the concept themselves.
This project will investigate a set of novel tools that allows design teams to guide clients through manipulating the design concepts themselves. To do this, we will place the clients in a physical environment that emulates the final high-end instrumented facility. End users will be able to view the command centre from any vantage point by simply walking around it. The configuration of workstations or controls on the panels can be modified by physically moving the prototypes or by manipulating the virtual information projected onto them.
To make these tools useful for the manufacturer, this new design methodology must be embedded in the company’s current design process. Issues of data transfer, operation semantics, workflows, and process planning will have to be addressed.
Working in close collaboration with Health Sciences, Dr Ross Smith developed a virtual reality simulation system to support research into chronic neck pain therapies. The system adopts a technique well known among virtual reality researchers, called “re-directed walking”, which alters the user’s perception of the relationship between the physical and virtual worlds. Re-directed walking applies a slight correction to the participant’s orientation so that their movement in the physical world differs from that in the virtual environment. With this, users can perceive that they are walking along an infinitely long straight line in the virtual environment although in the physical world they are actually walking in circles.
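The circular-walking effect can be sketched in a few lines. In this illustrative model (the step length and curvature gain are arbitrary values, not the study’s parameters), the system rotates the virtual scene by a small angle each step; the user turns to compensate, so a virtually straight walk traces a physical circle:

```python
import math

def physical_path(steps, step_len=0.7, curvature=0.05):
    """Physical (x, y) positions of a user who walks a virtually straight
    line while a constant rotation of `curvature` radians is injected per
    step; the user's unconscious compensation bends the real-world path."""
    x = y = heading = 0.0
    path = [(x, y)]
    for _ in range(steps):
        heading += curvature            # compensation for injected rotation
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        path.append((x, y))
    return path

# One full compensated turn brings the user back near the start,
# on a circle of radius roughly step_len / curvature (about 14 m here).
path = physical_path(steps=int(2 * math.pi / 0.05))
```

The approximate radius step_len / curvature shows the practical trade-off: subtler (smaller) gains need a larger tracked room.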
The re-directed walking method was adapted to alter the perceived head movements of 24 participants suffering chronic neck pain. Participants wore an Oculus Rift head-mounted display and were asked to rotate their heads left and right until their first onset of pain. The system changed the participants’ perception so that the actual movement could be more than, less than, or the same as what was perceived in the virtual world. The study showed that participants had a greater pain-free range of head rotation when the system understated the actual head rotation, yielding a 6% improvement in head mobility. These initial findings suggest that such techniques might be further developed and used in future therapies that employ virtual reality systems. The full article was published in the journal Psychological Science.
Pain is a perceptual response, one that researchers are finding is influenced by contextual, psychological, and sensory factors. In a study of the influence of visual feedback on pain, participants with neck pain rotated their heads while receiving different types of visual feedback through a virtual reality headset. The visual feedback gave the illusion that participants had turned their heads more, less, or to a degree equal to the actual physical rotation. Participants had a larger pain-free range of motion when they received understated visual feedback and a smaller pain-free range of motion when they received overstated visual feedback. The authors posit that, over time, sensory factors associated with pain may turn into triggers for the pain itself.
Daniel S. Harvie, Markus Broecker, Ross T. Smith, Ann Meulders, Victoria J. Madden, and G. Lorimer Moseley
University of South Australia researchers are using their augmented reality expertise to progress ground-breaking bionic eye research in Australia.
The Wearable Computer Lab at UniSA has created a backpack wearable computer kit that will be used for vision simulation studies being undertaken by the Vision Processing team at National Information Communications Technology Australia (NICTA) and Bionic Vision Australia.
“Using our system, Bionic Vision Australia will run studies allowing anyone to see as close as possible what someone with a bionic eye would be seeing,” said Dr Ross Smith, Co-Director of the Wearable Computer Lab.
Dr Smith presented the system with PhD candidate Thuong Hoang to NICTA and Bionic Vision Australia last month. The wearable kit has received a very positive response with several more systems ordered to be built for research with sighted participants.
“This is the second backpack design we have developed for NICTA,” Dr Smith said.
“The new version was inspired by the recent miniaturization of electronics, which allowed us to build a more usable, lightweight, reliable and effective solution.
“The new backpack provides more processing power, is lighter and smaller, uses advanced battery technologies, and has a more robust design to support trials.
“The staff were very impressed with the new solution.”
Nick Barnes, Senior Principal Researcher, NICTA, and Vision Processing Leader, Bionic Vision Australia, is looking forward to using the new system for simulated prosthetic vision trials.
“We are very pleased with the new solution,” Barnes said. “It is lighter weight, more reliable, significantly more comfortable for our volunteer participants, and has extra features that will broaden the types of trial we can undertake.”
Dr Smith said the Wearable Computer Lab team now has significant prototyping capabilities and is excited to use them to support research that aims to restore vision to those who have lost it.
Bionic Vision Australia is leading work on the bionic eye and is currently developing three bionic eye devices. The system consists of a small digital camera, an external processor, and an implant with a microchip and stimulating electrodes surgically placed at the back of the eye. It will be used to restore a sense of vision to people with retinitis pigmentosa and age-related macular degeneration.
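Simulated prosthetic vision studies of the kind the backpack supports typically reduce a camera feed to the coarse resolution an electrode array could convey. A minimal sketch of that idea follows; the 8×12 grid and four brightness levels are illustrative assumptions, not the specifications of any real implant:

```python
import numpy as np

def simulate_phosphenes(frame, grid=(8, 12), levels=4):
    """Reduce a grayscale camera frame (2-D array, values 0-255) to a
    coarse grid of quantised brightness levels, approximating what a
    low-resolution electrode array might convey to the wearer."""
    h, w = frame.shape
    gh, gw = grid
    out = np.zeros(grid)
    for i in range(gh):
        for j in range(gw):
            # Average the image block that falls on this "electrode".
            block = frame[i * h // gh:(i + 1) * h // gh,
                          j * w // gw:(j + 1) * w // gw]
            out[i, j] = block.mean()
    # Quantise to a handful of stimulation levels.
    return np.round(out / 255 * (levels - 1)) * (255 // (levels - 1))
```

In a sighted-participant trial, the output grid would be rendered back to the head-mounted display as a pattern of bright dots, letting researchers study task performance under the reduced resolution.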
UniSA’s Wearable Computer Lab has collaborated with NICTA on this project for several years, and following the positive response to the new backpack wearable computer kit, future directions for the system have been planned that will allow the collaboration to continue.
The Wearable Computer Lab has partnered with Jumbo Vision to produce a 1:1 spatial augmented reality design environment for command and control centres.
A new patented technology that can plot walls or furniture from life-size building design plans is just one of the research projects that highlights how the University of South Australia (UniSA) is using its world-class expertise and capabilities to address industry needs through the University’s commercialisation arm ITEK Ventures.
CADwalk™, Jumbo Vision International’s ingenious fusion of high-tech 2D and 3D visualisation and control room design skills, allows users – such as operators, architects, builders and engineers – to walk their way through a full-sized representation of a control room. Modifications to the room layout are made possible by a special camera and movable pole receptor system created by Jumbo Vision International and UniSA, which allows users to physically walk through their project at 1:1 scale.
The result is a life-sized digital playground that gives users a fully immersive visual sense of the room as it will appear when completed. Interactive software provides feedback and control over the layout in real time, with users simply having to physically move the tracking devices until they have achieved the desired output.
CADwalk™ has already been utilised in a live setting, with promising results, by its very first user worldwide, New Zealand’s national electricity power provider Transpower. Here, the virtual reality-based system allowed users to engage with their recently completed major control room upgrade as the design took shape, resulting in an estimated cut of three months from the design project cycle. To find out more about research at the University of South Australia, visit unisa.edu.au.
Here are some press releases with more information.
GM Holden at Elizabeth, South Australia, has been using the WCL’s spatial augmented reality (SAR) projection system to explore its application to spot welding on the production line.
For further information, please refer to their online blog.
Intel has been using the WCL’s spatial augmented reality (SAR) framework to explore its application to maintenance in manufacturing plants.
“Intel’s pace of innovation is ferocious and our semiconductor manufacturing equipment is complex, so technicians rarely get a chance to stay experts for long before new technology changes the equipment. Much like any manufacturing business, our engineers are constantly improving the process and updating the maintenance procedures to reflect the latest information. Rapid adoption of these changes is critical. Intel IT is using projected Augmented Reality to bring these changes directly into the tools where technicians are doing the work. This reduces the training time required for a new procedure, allowing our experts to ramp faster. Watch this video to see how Intel IT is currently utilizing Projected Augmented Reality and plans for the future.”