Author: Michael Marner

CADwalk brings Holodeck reality to walk-through design process

The Wearable Computer Lab partners with Jumbo Vision International to produce a 1:1 Spatial Augmented Reality design environment for Command and Control Centres.

A new patented technology that can plot walls or furniture from life-size building design plans is just one of the research projects highlighting how the University of South Australia (UniSA) is using its world-class expertise and capabilities to address industry needs through the University’s commercialisation arm, ITEK Ventures.

CADwalk™, Jumbo Vision International’s ingenious fusion of high-tech 2D and 3D visualisation and control room design skills, allows users – such as operators, architects, builders and engineers – to walk through a full-sized representation of a control room. A special camera and movable pole receptor system, created by Jumbo Vision International and UniSA, lets users modify the room layout while physically walking through their project at 1:1 scale.

The result is a life-sized digital playground that gives users a fully immersive visual sense of the room as it will appear when completed. Interactive software provides feedback and control over the layout in real time, with users simply moving the tracking devices until they achieve the desired layout.

CADwalk™ has already been used in a live setting, with promising results, by its very first user worldwide: New Zealand’s national electricity provider, Transpower. Here, the system allowed users to engage with the design of their major control room upgrade as it took shape, cutting an estimated three months from the design project cycle.

To find out more about research at the University of South Australia, visit unisa.edu.au.

Here are some press releases with more information.

  • CADwalk brings Holodeck reality to walk through design process
  • UniSA ATSE Focus Research

AUIC 2012 Roundup


So the Australasian User Interface Conference for 2012 has been and gone. The Wearable Computer Lab presented two full papers and two posters, and I was an author on one of them :)

The papers we presented are listed below, and the publication page has been updated so you can get the PDFs. Cheers!

E. T. A. Maas, M. R. Marner, R. T. Smith, and B. H. Thomas, “Supporting Freeform Modelling in Spatial Augmented Reality Environments with a New Deformable Material,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012. (pdf) (video)

T. M. Simon, R. T. Smith, B. H. Thomas, G. S. Von Itzstein, M. Smith, J. Park, and J. Park, “Merging Tangible Buttons and Spatial Augmented Reality to Support Ubiquitous Prototype Designs,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.

S. J. O’Malley, R. T. Smith, and B. H. Thomas, “Poster: Data Mining Office Behavioural Information from Simple Sensors,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.

T. M. Simon and R. T. Smith, “Poster: Magnetic Substrate for use with Tangible Spatial Augmented Reality in Rapid Prototyping Workflows,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.

Quimo: A deformable material to support freeform modelling in spatial augmented reality environments


Hello Everyone

3DUI has wrapped up for the year, so here is our second publication. We introduce a new material for freeform sculpting in spatial augmented reality environments. Please read the paper, and have a look at the video below.

New Publication: Adaptive Color Marker for SAR Environments


Hey Everyone

So right now I am at the IEEE Symposium on 3D User Interfaces in Singapore. We have a couple of publications which I’ll be posting over the next few days. First up is Adaptive Color Marker for SAR Environments. In a previous study we created interactive virtual control panels by projecting onto otherwise blank designs. We used a simple orange marker to track the position of the user’s finger. However, in a SAR environment, this approach suffers from several problems:

  • The tracking system can’t track the marker if we project the same colour as the marker.
  • Projecting onto the marker changes its appearance, causing tracking to fail.
  • Users could not tell when they were pressing virtual controls, because their finger occluded the projection.
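For context, the baseline boils down to simple colour thresholding on the camera image. Here is a minimal sketch of that idea in Python with OpenCV (version 4 API); the HSV range and blob-picking logic are illustrative assumptions, not the exact values from our system:

    import cv2
    import numpy as np

    # Hypothetical HSV range for an orange marker; tune for your camera.
    LOWER = np.array([5, 120, 120])
    UPPER = np.array([20, 255, 255])

    def find_marker(frame_bgr):
        """Return the (x, y) centroid of the largest orange blob, or None."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None  # e.g. the projection has washed out the marker colour
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

The failure modes above fall out directly: if the projector paints the marker with a similar colour, or shifts its apparent colour, the mask comes back empty or noisy and tracking drops out.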

We address these problems with an active colour marker. We use a colour sensor to detect what is being projected onto the marker, and change the colour of the marker to an opposite colour, so that tracking continues to work. In addition, we can use the active marker as a form of visual feedback. For example, we can change the colour to indicate a virtual button press.
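As a rough sketch of the colour-selection step, the simplest plausible strategy is to rotate the sensed hue by 180 degrees. This is only one choice; the paper describes the actual selection logic:

    import colorsys

    def contrasting_colour(r, g, b):
        """Given the colour sensed on the marker (floats in 0-1), return an
        RGB colour on the opposite side of the hue wheel, fully saturated."""
        h, _, _ = colorsys.rgb_to_hsv(r, g, b)
        return colorsys.hsv_to_rgb((h + 0.5) % 1.0, 1.0, 1.0)

    # e.g. red light projected onto the marker -> drive the marker LED cyan
    print(contrasting_colour(1.0, 0.0, 0.0))  # (0.0, 1.0, 1.0)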

I’ve added the publication to my publications page, and here’s the video of the marker in action.

Augmented Foam Sculpting for Capturing 3D Models


This weekend I presented my paper, Augmented Foam Sculpting for Capturing 3D Models, at the International Symposium on 3D User Interfaces. Since the conference has passed, I have added the video to YouTube and the paper to my publications page. First, the video, then some discussion after the jump.

Foam Sculpting

The inspiration for this work came out of a project we did with some industrial design students. Their job was to create input devices for my SAR Airbrushing system. First up, we had a meeting where I showed them a very early development version of the system, to give them an idea of what we were doing. They went away and came up with ideas for input devices, and in the next meeting had a bunch of sketches ready. We discussed the sketches: what we liked and what we didn’t like. Next, they brought us foam mockups of some of the designs. We discussed these, and eventually they came back with full CAD models ready for 3D printing. They did a great job, by the way. But it got us thinking:

How can we make this process better?

Augmented Foam Sculpting is the result of this work. It allows a designer/artist to simultaneously create a physical design mockup and matching virtual model. This is a Good Thing™, because it utilises the skills and tools that designers are already using.

The system works by tracking the position and orientation of both the hot wire foam cutter and the piece of foam the user is sculpting. We can track the motion of the hot wire as it passes through the foam. From there, we can create geometry that matches the cut path and perform a Boolean difference operation on the foam geometry to replicate the cut in the virtual model. (Before any of you “Booleans are evil” people get to me, I’d like to point out I’m only dealing with, and creating, triangle meshes. There are no 11-sided polygons here.)
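To make the geometry side concrete, here is a hedged sketch of the sweep-and-subtract step using the Python trimesh library. The cut-volume construction and dimensions are my own illustration rather than our implementation, and the difference() call needs a boolean backend (e.g. Blender or manifold3d) installed:

    import numpy as np
    import trimesh

    def cut_volume(p0, p1, q0, q1, thickness=2.0):
        """Thin solid slab swept by a wire moving from segment (p0, p1)
        to segment (q0, q1), suitable for a Boolean subtraction."""
        quad = np.array([p0, p1, q1, q0], dtype=float)
        # Offset the swept quad along its normal to give the slab thickness.
        n = np.cross(quad[1] - quad[0], quad[3] - quad[0])
        n = n / np.linalg.norm(n) * (thickness / 2.0)
        verts = np.vstack([quad - n, quad + n])       # 8 corners
        faces = [[0, 1, 2], [0, 2, 3], [4, 6, 5], [4, 7, 6],
                 [0, 4, 5], [0, 5, 1], [1, 5, 6], [1, 6, 2],
                 [2, 6, 7], [2, 7, 3], [3, 7, 4], [3, 4, 0]]
        slab = trimesh.Trimesh(vertices=verts, faces=faces)
        slab.fix_normals()  # consistent winding so the Boolean op behaves
        return slab

    foam = trimesh.creation.box(extents=[100, 100, 100])  # stand-in foam block
    cut = cut_volume([0, -60, -60], [0, -60, 60], [0, 60, -60], [0, 60, 60])
    foam = foam.difference(cut)  # replicate the physical cut on the virtual model

In the real system the wire is sampled many times per second, so each pair of consecutive tracked segments contributes one small slab like this to the overall cut.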

Using projectors, we can add extra information to the foam as the user sculpts. We implemented two visualisations to aid designers when creating specific models. Cut Animation displays the cuts to be made as animated lines on the foam surface; once a cut has been made, the system moves on to the next one. This visualisation could be used to recreate a previous object, or to instruct novices. An algorithm could be developed to calculate the actual cuts that need to be made, reducing the amount of planning needed when making an object.

The second visualisation, Target, projects a target model so that it appears to be inside the foam. The foam is coloured based on how much material needs to be removed to match the target. This could be used to create variations on a previous model.
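A minimal sketch of that colouring, assuming foam and target_model are hypothetical trimesh meshes loaded elsewhere: query the signed distance from each foam vertex to the target and ramp from green (at the target surface) to red (lots of material still to remove). The ramp and depth scale are my choices, not necessarily what the paper uses:

    import numpy as np
    import trimesh

    def target_colours(foam, target, max_depth=30.0):
        """Per-vertex RGBA colours for the foam, based on distance to target."""
        # trimesh convention: points inside the target get positive distance
        d = trimesh.proximity.signed_distance(target, foam.vertices)
        excess = np.clip(-d / max_depth, 0.0, 1.0)  # 0 = at target, 1 = far outside
        colours = np.zeros((len(foam.vertices), 4), dtype=np.uint8)
        colours[:, 0] = (excess * 255).astype(np.uint8)          # red: cut more
        colours[:, 1] = ((1.0 - excess) * 255).astype(np.uint8)  # green: done
        colours[:, 3] = 255
        return colours

    # foam, target_model: trimesh.Trimesh instances (assumed loaded elsewhere)
    foam.visual.vertex_colors = target_colours(foam, target_model)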

Finally, we can use 3D procedural textures to change the appearance of the foam. For example, we implemented a wood grain 3D texture. This works pretty well, because as you cut away the foam, the texture updates to appear as though the wood was actually cut. 3D textures are also ideal because we don’t need to generate texture coordinates after each cut.
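For the curious, a wood grain 3D texture in this spirit can be as simple as a sine wave over the distance from a virtual trunk axis. Because colour depends only on 3D position, a fresh cut automatically exposes plausible interior grain. This is an illustrative sketch, not our exact implementation:

    import numpy as np

    def wood_grain(points, ring_period=5.0,
                   dark=(92, 62, 30), light=(176, 127, 73)):
        """Map (n, 3) object-space positions to RGB wood colours."""
        x, y = points[:, 0], points[:, 1]
        r = np.sqrt(x ** 2 + y ** 2)            # distance from the trunk axis
        rings = 0.5 + 0.5 * np.sin(2 * np.pi * r / ring_period)
        dark = np.array(dark, dtype=float)
        light = np.array(light, dtype=float)
        return (dark + rings[:, None] * (light - dark)).astype(np.uint8)

Adding a little noise to r would break up the perfectly circular rings, but even this version sells the effect once projected onto the cut surfaces.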

For all the details, please have a read of the paper. If you have any questions/comments/feedback/abuse, please comment on this post, or send me an email.

Michael