Category: Augmented Reality

Intel using WCL’s SAR System for Augmented Reality in Manufacturing

Intel has been using the WCL’s spatial augmented reality (SAR) framework to explore its application to maintenance in manufacturing plants.

Summary:

“Intel’s pace of innovation is ferocious and our semiconductor manufacturing equipment is complex, so technicians rarely get a chance to stay experts for long before new technology changes the equipment. Much like any manufacturing business, our engineers are constantly improving the process and updating the maintenance procedures to reflect the latest information. Rapid adoption of these changes is critical. Intel IT is using projected Augmented Reality to bring these changes directly into the tools where technicians are doing the work. This reduces the training time required for a new procedure, allowing our experts to ramp faster. Watch this video to see how Intel IT is currently utilizing Projected Augmented Reality and plans for the future.”

Projected Augmented Reality: Keeping Pace with Innovation

Demonstration at Intel TechFest in Portland

CFP Workshop on Wearables for Outdoor Augmented Reality – ISWC 2012

held in conjunction with the International Symposium on Wearable Computers ISWC 2012

Overview

Augmented Reality (AR) is a successful application area of Wearable Computing, especially for Outdoor AR, where mobility is an important factor. The most successful wearable computer to date is the mobile phone, which is also an excellent platform for Outdoor AR systems. Wearable Outdoor AR systems are widely utilized in various domains, including architecture, military, tourism, navigation, and entertainment. Such diverse usage imposes several challenges on researchers from both augmented reality and wearable computing, such as interaction, activity and context recognition, wearability, design, and modeling. We invite researchers from relevant disciplines to a one-day workshop held in conjunction with ISWC 2012 to present novel work and discuss the application of state-of-the-art wearable computing research to outdoor augmented reality systems. The workshop also provides an opportunity for directed discussion sessions to identify current issues, research topics, and solution approaches, leading to proposals for future research directions.

Considered within the context of Wearables for Outdoor Augmented Reality, the related research topics include, but are not restricted to:

–       wearable augmented reality applications and systems
–       interaction techniques
–       input devices
–       outdoor tracking technologies
–       modeling, model reconstruction
–       wearability, wearable AR system design
–       activity and context recognition
–       use cases, usability study, and evaluations

The workshop aims to present an overview of latest research work in the area of Wearable Outdoor AR, and to provide an opportunity for the participants to have a focused discussion to identify future research directions. There is a great potential to apply the latest research in wearable computing, such as gesture recognition, wearable interaction, wearability design, and activity and context recognition, to outdoor AR research. The workshop is targeted at researchers from both research areas of wearable computing and outdoor AR, as well as any related application disciplines.

Important Dates

The workshop will be an opportunity to present position papers on topics relevant to Wearable Outdoor Augmented Reality. We invite submission of 2-4 page position papers using the IEEE Computer Science Press 8.5×11-inch two-column format (LaTeX and Microsoft Word templates), to be sent to Thuong.Hoang@unisa.edu.au. The position papers will be reviewed by the organizing committee and external reviewers. Authors will be notified of the results and will receive instructions for submitting the final version. The schedule for submission and notifications is as follows:

Submission due: 13 April 2012
Notifications to authors: 23 April 2012
Final version due: 30 April 2012

We are seeking papers that are DIRECTLY related to the area of Wearables for Outdoor Augmented Reality. The authors are required to describe explicitly how their technologies are applied to Wearables for Outdoor Augmented Reality.

Accepted papers will be included in the electronic adjunct proceedings distributed to participants of ISWC 2012. Authors presenting at the workshop are invited to co-author a journal paper outlining ten future research challenges and/or future directions for Wearable Outdoor Augmented Reality research, based on the outcomes of the discussion held at the workshop. The paper is to be submitted to an appropriate journal.

Agenda

The workshop is divided into two parts:

–       Morning presentations:

  • Purpose: researchers present their works, directly related to Wearables for Outdoor Augmented Reality.
  • Format: presentations.
  • Topic: The topics/agenda for the presentations and discussions MUST be directly related to the area of Wearables for Outdoor Augmented Reality. They include, but are not limited to:
    • wearable augmented reality applications and systems
    • interaction techniques
    • input devices
    • outdoor tracking technologies
    • modeling and model reconstruction
    • wearability and wearable AR system design
    • activity and context recognition
    • use cases, usability study, and evaluations

–       Afternoon discussions:

  • Purpose: discussions about the current issues and future research challenges of Wearables for Outdoor Augmented Reality.
  • Format: directed discussions.
  • Outcome: outlining ten research challenges and/or future directions for Wearables for Outdoor Augmented Reality. Participants are invited to co-author a journal paper based on the outcomes of the discussions, to be submitted to an appropriate journal.

Organizers

Bruce H. Thomas & Thuong N. Hoang

Wearable Computer Lab – University of South Australia

Bruce.Thomas@unisa.edu.au, Thuong.Hoang@unisa.edu.au

AUIC 2012 Roundup


So the Australasian User Interface Conference for 2012 has been and gone. The Wearable Computer Lab presented two full papers and two posters, one of which I authored :)

The papers we presented are listed below, and the publication page has been updated so you can get the PDFs. Cheers!

E. T. A. Maas, M. R. Marner, R. T. Smith, and B. H. Thomas, “Supporting Freeform Modelling in Spatial Augmented Reality Environments with a New Deformable Material,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012. (pdf) (video)

T. M. Simon, R. T. Smith, B. H. Thomas, G. S. Von Itzstein, M. Smith, J. Park, and J. Park, “Merging Tangible Buttons and Spatial Augmented Reality to Support Ubiquitous Prototype Designs,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.

S. J. O’Malley, R. T. Smith, and B. H. Thomas, “Poster: Data Mining Office Behavioural Information from Simple Sensors,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.

T. M. Simon and R. T. Smith, “Poster: Magnetic Substrate for use with Tangible Spatial Augmented Reality in Rapid Prototyping Workflows,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.

Quimo. A deformable material to support freeform modelling in spatial augmented reality environments

Hello Everyone
3DUI has wrapped up for the year, so here is our second publication. We introduce a new material for freeform sculpting in spatial augmented reality environments. Please read the paper, and have a look at the video below.

 



 

New Publication: Adaptive Color Marker for SAR Environments


Hey Everyone

So right now I am at the IEEE Symposium on 3D User Interfaces in Singapore. We have a couple of publications which I’ll be posting over the next few days. First up is Adaptive Color Marker for SAR Environments. In a previous study we created interactive virtual control panels by projecting onto otherwise blank designs. We used a simple orange marker to track the position of the user’s finger. However, in a SAR environment, this approach suffers from several problems:

  • The tracking system can’t track the marker if we project the same colour as the marker.
  • Projecting onto the marker changes its appearance, causing tracking to fail.
  • Users could not tell when they were pressing virtual controls, because their finger occluded the projection.

We address these problems with an active colour marker. We use a colour sensor to detect what is being projected onto the marker, and change the colour of the marker to an opposite colour, so that tracking continues to work. In addition, we can use the active marker as a form of visual feedback. For example, we can change the colour to indicate a virtual button press.
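The colour-opposition idea can be sketched in a few lines. This is a minimal illustration, not the paper’s actual implementation; the function name and the simple RGB-complement strategy are my assumptions (the real system reads a hardware colour sensor and drives the marker’s colour):

```python
# Sketch of the "opposite colour" strategy described above (illustrative only).

def complement_rgb(projected):
    """Given the RGB colour currently projected onto the marker (as read by
    the colour sensor), return an opposing colour for the marker so the
    camera can still segment it against the projection."""
    r, g, b = projected
    return (255 - r, 255 - g, 255 - b)

# Example: if the projector floods the marker with orange light,
# the marker switches to a blue-ish complement.
print(complement_rgb((255, 128, 0)))  # -> (0, 127, 255)
```

The same channel could double as feedback: briefly overriding the computed complement with a fixed flash colour would signal a virtual button press.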

I’ve added the publication to my publications page, and here’s the video of the marker in action.

 

Augmented Foam Sculpting for Capturing 3D Models


This weekend I presented my paper, Augmented Foam Sculpting for Capturing 3D Models, at the International symposium on 3D user interfaces. Since the conference has passed, I have added the video to youtube and the paper to my publications page. First, the video, then some discussion after the jump.

Foam Sculpting

The inspiration for this work came out of a project we did with some industrial design students. Their job was to create some input devices for my SAR Airbrushing system. First up, we had a meeting where I showed them a very early development version of the system, to give them an idea of what we were doing. They went away and came up with ideas for input devices, and in the next meeting had a bunch of sketches ready. We discussed the sketches: what we liked and what we didn’t like. Next, they brought us foam mockups of some of the designs. We discussed these, and eventually they came back with full CAD models ready for 3D printing. They did a great job, by the way. But it got us thinking:

How can we make this process better?

Augmented Foam Sculpting is the result of this work. It allows a designer/artist to simultaneously create a physical design mockup and matching virtual model. This is a Good Thing™, because it utilises the skills and tools that designers are already using.

The system works by tracking the position and orientation of both the hot wire foam cutter and the piece of foam the user is sculpting. We can track the motion of the hot wire as it passes through the foam. From there, we can create geometry that matches the cut path, and perform a Boolean difference operation on the foam geometry to replicate the cut in the virtual model. (Before any of you “Booleans are evil” people get to me, I’d like to point out I’m only dealing with, and creating, triangle meshes. There are no 11-sided polygons here.)
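As a rough illustration of the geometry step (the names and the sampling scheme here are hypothetical, not the actual implementation): consecutive tracked positions of the wire’s two endpoints can be stitched into a ruled triangle strip, which is the cut surface you would then hand to a mesh Boolean difference:

```python
# Hypothetical sketch: turn tracked wire samples into cut-surface triangles.
# Each sample is the wire's two endpoints in foam-local coordinates; each pair
# of consecutive samples contributes one quad (two triangles) of the ruled
# surface swept by the wire. The Boolean difference itself is not shown.

def cut_surface(samples):
    """samples: list of (p, q) wire endpoint pairs, each a 3-tuple (x, y, z).
    Returns a list of triangles (3-tuples of points) sweeping the wire path."""
    tris = []
    for (p0, q0), (p1, q1) in zip(samples, samples[1:]):
        tris.append((p0, q0, p1))  # first triangle of the quad
        tris.append((q0, q1, p1))  # second triangle of the quad
    return tris

# Three samples of a wire moving along +x give two quads, i.e. 4 triangles.
samples = [((0, 0, 0), (0, 1, 0)),
           ((1, 0, 0), (1, 1, 0)),
           ((2, 0, 0), (2, 1, 0))]
print(len(cut_surface(samples)))  # -> 4
```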

Using projectors, we can add extra information to the foam as the user sculpts. We implemented two visualisations to aid designers when creating specific models. Cut Animation displays cuts to be made as animated lines on the foam surface. Once a cut has been made, the system moves to the next one. This visualisation could be used to recreate a previous object, or to instruct novices. An algorithm could be developed to calculate the actual cuts that need to be made, reducing the amount of planning needed when making an object.

The second visualisation, Target, projects a target model so that it appears to be inside the foam. The foam is coloured based on how much needs to be removed to match a target model. This could be used to create variations on a previous model.
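A minimal sketch of that colouring, assuming a simple linear green-to-red ramp over the remaining material depth (the mapping in the actual system may well differ):

```python
# Illustrative depth-to-colour ramp for the Target visualisation (assumed form).

def depth_to_colour(depth, max_depth):
    """Map how much foam remains above the target surface (depth, same units
    as max_depth) to a green->red gradient: 0 = at the target (green),
    max_depth or more = lots still to remove (red)."""
    t = min(max(depth / max_depth, 0.0), 1.0)
    return (int(255 * t), int(255 * (1 - t)), 0)  # (R, G, B)

print(depth_to_colour(0, 10))   # -> (0, 255, 0)   surface matches the target
print(depth_to_colour(10, 10))  # -> (255, 0, 0)   plenty of foam left to cut
```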

Finally, we can use 3D procedural textures to change the appearance of the foam. For example, we implemented a wood grain 3D texture. This works pretty well, because as you cut away the foam, the texture updates to appear as though the wood was actually cut. 3D textures are also ideal because we don’t need to generate texture coordinates after each cut.
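A toy version of such a 3D texture (illustrative only; the concentric-ring model and parameters are simplifications of a real wood-grain shader): grain intensity is a pure function of object-space position, so a freshly cut surface is textured consistently and no texture coordinates are ever needed:

```python
import math

def wood_grain(x, y, z, ring_spacing=0.02):
    """Toy 3D wood-grain texture: concentric rings around the z axis.
    Returns a grain intensity in [0, 1] for a point in object space.
    Because it depends only on 3D position, cutting away material simply
    exposes the texture at the new surface, as if the wood were really cut."""
    radius = math.hypot(x, y)  # distance from the "trunk" axis
    return 0.5 + 0.5 * math.sin(2 * math.pi * radius / ring_spacing)
```

In the real system an equivalent function would be evaluated per fragment in a shader while projecting onto the foam.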

For all the details, please have a read of the paper. If you have any questions/comments/feedback/abuse, please comment on this post, or send me an email.

Michael