Augmented Reality App – The Art of Science Exhibition

Nicolas Baudin’s ships, Géographe and Naturaliste, embarked from Le Havre in October 1800 for the southern continent carrying an impressive contingent of scientists and scientific assistants. Lavishly funded by Napoleon Bonaparte, the expedition’s agenda was discovery and the study of the natural sciences. The ships returned to France with over 100,000 specimens and a collection of 1,500 natural history drawings.

The exhibition ‘The Art of Science: Baudin’s Voyagers 1800–1804’ at the South Australian Maritime Museum showcases original sketches and paintings from the Museum of Natural History in Le Havre, France, to South Australian audiences for the first time.

The app developed for this exhibition brings to life four of the paintings created during Baudin’s voyage. Each of the four images triggers a separate augmented reality experience. Its development was a collaboration between local designer Peter Bushell and me.

Download the app and use the target images at the end of this post to try it out.

Software

The software we used to create the app included:

  • Blender – modelling and animating the 3D models
  • Photoshop – editing the texture maps
  • Unity – building the app itself
  • Vuforia – the augmented reality plugin for Unity

Creating the 3D Models

An image of each of the artworks was used as a base to create the 3D models (or 3D meshes) from scratch in Blender. Primitive objects (like cylinders, cubes and spheres) were combined and moulded to form a 3D mesh that closely resembled the artwork. Parts of the artwork were then used to create a texture map. In Blender, a texture map is a flat image that has been unwrapped from the 3D model and then modified in image-editing software (like Photoshop) to create a texture that can be imported and applied to the 3D model.
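To picture how a texture map connects the flat image back to the 3D mesh, here is a minimal, purely illustrative Python sketch (this is not Blender code; the function name and toy texture are invented for this example). Each vertex carries a normalised (u, v) coordinate that indexes into the unwrapped image:

```python
# Illustrative sketch of UV texture lookup: each 3D vertex carries a
# (u, v) coordinate into a flat image, and the renderer samples the
# image there to colour the surface. Names and data are hypothetical.

def sample_texture(texture, u, v):
    """Map normalised (u, v) in [0, 1] to a pixel in the texture grid."""
    height = len(texture)
    width = len(texture[0])
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]

# A tiny 2x2 "texture": top row dark, bottom row light.
texture = [["dark", "dark"],
           ["light", "light"]]

print(sample_texture(texture, 0.25, 0.1))  # a point near the top-left: "dark"
```

Blender’s unwrap step is essentially the inverse: it assigns those (u, v) coordinates to the mesh so the painted image lands on the right parts of the model.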

3D mesh with no texture

Applying a Texture

3D mesh with texture

The next step in creating the 3D models was to give them the ability to move. To do this, an armature was created. In simple terms, an armature is a skeleton made up of separate bones that control parts of the 3D model, just as our bones and joints define our range of movement. A few steps were needed to ensure that each bone only moved what it was supposed to move. In Blender this process is referred to as weight painting: painting a heat-map-like image onto the mesh that controls how much influence each bone has over it.
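The effect of weight painting can be sketched in a few lines of illustrative Python (this is not Blender’s implementation; the bone names, offsets and weights are made up, and real skinning blends full transforms rather than plain translations). Each vertex moves by the weighted sum of its bones’ movements:

```python
# Simplified sketch of linear blend skinning, reduced to translations.
# Weight painting assigns each vertex a per-bone influence in [0, 1];
# a vertex weighted 50/50 between two bones follows each halfway.

def skin_vertex(position, bone_offsets, weights):
    """Blend each bone's offset by its painted weight."""
    moved = list(position)
    for bone, offset in bone_offsets.items():
        w = weights.get(bone, 0.0)
        for axis in range(3):
            moved[axis] += w * offset[axis]
    return moved

# The "tail" bone moves up by 2 units; the "body" bone stays put.
bone_offsets = {"body": (0, 0, 0), "tail": (0, 0, 2)}

# A vertex painted 50/50 between body and tail rises only 1 unit.
print(skin_vertex([1.0, 0.0, 0.0], bone_offsets,
                  {"body": 0.5, "tail": 0.5}))  # [1.0, 0.0, 1.0]
```

This is why careless weight painting makes, say, a fin drag half the body along with it: stray weights give a bone influence over vertices it should not touch.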

3D Model with Armature

Weight painting the mesh to work with the armature.

Creating the AR Experience

The augmented reality (AR) experience was created using the Unity game engine. Importantly, the Vuforia plugin is required to make all the magic happen. It is simply a matter of downloading the Vuforia package for Unity (https://developer.vuforia.com/downloads/sdk) and double-clicking it, which adds the Vuforia package to the Unity project.

A Unity game has scenes, and each scene has a camera. The default camera lets the viewer (or game player) see different parts of the game environment and can move around, much like a film shot from different angles, with panning and so on. The Vuforia package provides an augmented reality camera that is different: it combines the device’s camera feed with virtual objects (3D models) to form the viewed environment.

In our case we needed one scene with one augmented reality camera that displayed each individual experience (animated 3D model) depending on which artwork was being looked at. Each animated experience was triggered by a target. A target is an image that is used to trigger the AR experience, and in our case we wanted to use the actual artworks as targets. The key to a good target is a unique pattern that can be identified as a trigger (similar to a QR code that links to a URL). There were some issues in refining our target images to work well, particularly the jellyfish and the seal, which don’t have clearly identifiable markings like the spots on the puffer fish or the geometric spacing of the frogs. The photographs we were working from were also not very clear, and this turned out to be the main obstacle to getting Vuforia to recognise patterns in the artwork. Taking some clearer photographs of the original artworks let me get the seal and jellyfish targets working to an acceptable level.
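To give a feel for why the spotted puffer fish rates better as a target than the low-contrast seal, here is a very crude, hypothetical stand-in for a target-quality score in Python (Vuforia’s real feature detector is far more sophisticated; the scoring rule and sample pixel grids are invented for illustration):

```python
# Crude stand-in for a target rating: count strong local contrast
# steps in a grayscale image. A busy, high-contrast pattern yields
# many detectable features; a soft, washed-out image yields few.

def contrast_score(image, threshold=50):
    """Count horizontally adjacent pixel pairs with a strong contrast step."""
    score = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            if abs(a - b) >= threshold:
                score += 1
    return score

spotted = [[0, 255, 0, 255],
           [255, 0, 255, 0]]      # busy pattern: many sharp edges
flat = [[120, 125, 130, 128],
        [122, 126, 129, 127]]     # soft gradients: almost no features

print(contrast_score(spotted), contrast_score(flat))  # 6 0
```

The same intuition explains why the clearer photographs helped: sharpening the image restores exactly the kind of local contrast a feature detector keys on.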

Vuforia’s online tool allows the targets to be added to a package that can be downloaded. Clicking this downloaded package adds a target database to Unity that can be used to create target objects that trigger the AR experiences.

Note: The targets are not dependent on colour; rather, they identify notable characteristics in the image.

Use the target manager in the Vuforia portal to add and rate targets.

The Vuforia package allows multiple targets to be attached to a scene. It was a matter of creating four of these target objects by dragging a Vuforia target prefab into the scene. A prefab is a pre-constructed object that can be customised by changing parameters. Each of the target objects was associated with a target image provided in the downloaded target database.

Additionally, the targets needed to trigger a 3D model.

Animating the 3D Models

In order to create the animated 3D models I first had to load all the objects into Unity and associate each with its target. Unity has built-in support for Blender, so it was simply a matter of copying the Blender files for each 3D model into the Unity assets folder and then, in Unity, dragging the prefab that Unity creates onto its respective target object. At this stage, playing the scene and putting each target in front of the camera would cause a static 3D model to appear in space.

3D Model and Vuforia Target in Unity

The final step was to animate each of the objects. This was done by creating key frames for each animation. Key frames are points in time at which the object is posed in a different position; the engine fills in the motion between them, so the key frames add up to one continuous animation sequence.
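The filling-in between key frames can be sketched with simple linear interpolation in Python (engines like Unity and Blender actually use richer easing curves; the times and angles below are invented for this example):

```python
# Sketch of key frame interpolation: given (time, value) key frames,
# compute the in-between value at any time t by blending linearly
# between the two surrounding keys.

def interpolate(keyframes, t):
    """Linearly interpolate a value at time t from sorted (time, value) pairs."""
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            blend = (t - t0) / (t1 - t0)
            return v0 + blend * (v1 - v0)
    return keyframes[-1][1]

# A fin rotation keyed at 0 degrees (t=0), 30 degrees (t=1), back to 0 (t=2).
fin_rotation = [(0.0, 0.0), (1.0, 30.0), (2.0, 0.0)]
print(interpolate(fin_rotation, 0.5))  # halfway to the first key: 15.0
```

Keying only the extremes of a pose and letting the engine interpolate is what makes a handful of key frames feel like continuous swimming.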

The artwork comes to life

The result is an effective augmented reality experience that is coupled with The Art of Science travelling exhibition (presented by the South Australian Maritime Museum).

Puffer fish swimming out of artwork.

The tools used to create this are cutting edge and easy to use. Very little code was written to create the app, which makes the process accessible even if you are not a programmer. Another benefit of this kind of app is that it need not be restricted to targets in one location, like the artworks in the exhibition. The targets can be duplicated on marketing material or even online, as in the case of the fish above, which appears online and on all of the exhibition’s promotional material.

Download the app and try using the target images below:

 

[Slideshow of target images]

2 Comments

  • Tara Lee says:

    This is amazing Oliver! Would there be the ability to add sound when the trigger image was activated?

    • Oliver Scholey says:

      Hi Tara

      Thanks and yes, targets can trigger sound.

      Attach an audio source to your image target and then attach the code snippet below. This will play the audio on detection and stop audio when tracking is lost. Good luck and let me know if you need any further info.


      using System.Collections;
      using System.Collections.Generic;
      using UnityEngine;
      using Vuforia;

      // Plays the attached AudioSource when the Vuforia target is
      // detected and stops it when tracking is lost.
      public class TriggerAudio : MonoBehaviour, ITrackableEventHandler
      {
          private TrackableBehaviour mTrackableBehaviour;

          void Start()
          {
              mTrackableBehaviour = GetComponent<TrackableBehaviour>();
              if (mTrackableBehaviour)
              {
                  mTrackableBehaviour.RegisterTrackableEventHandler(this);
              }
          }

          public void OnTrackableStateChanged(
              TrackableBehaviour.Status previousStatus,
              TrackableBehaviour.Status newStatus)
          {
              if (newStatus == TrackableBehaviour.Status.DETECTED ||
                  newStatus == TrackableBehaviour.Status.TRACKED ||
                  newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
              {
                  // Play audio when the target is found
                  GetComponent<AudioSource>().Play();
              }
              else
              {
                  // Stop audio when the target is lost
                  GetComponent<AudioSource>().Stop();
              }
          }
      }
