Recently the National Motor Museum launched an exhibition showcasing the vehicles and content of the Bush Mechanics television series. It is a travelling exhibition, showing at the National Motor Museum in Birdwood, South Australia, before touring around the country. One exciting element of this exhibition is an augmented reality (AR) project we undertook to bring one of the Bush Mechanics cars to life. This is our second foray into augmented reality as an organisation; the first was for the South Australian Maritime Museum exhibition The Art of Science.
The augmented reality app is triggered by images on one of the vehicles in the exhibition, a Ford Fairlane that featured in the Bush Mechanics series. The body of the car was painted by artist Jangala Rice. Each design represents something: the concentric rings on the bonnet, for example, represent fire, and the curved red design represents a windbreak (there are 8 symbolic references in total). The idea behind the app is to have an augmented experience pop up when the camera locks onto each image. For example, fire pops up when a tablet or mobile device with the app installed is pointed at the concentric rings, and a windbreak appears when locking onto the curved red design. A number of challenges and limitations were identified during the scoping exercise for the initial planning of the app.
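The original design-per-target idea can be sketched as a simple lookup from a recognised image target to the experience it triggers. This is an illustrative model only: the target names and experience identifiers below are hypothetical, and the real app was built on Vuforia rather than in plain Python.

```python
# Hypothetical mapping from image-target name to the AR experience it plays.
# Only two of the eight painted designs are shown; the names are invented
# for illustration and do not come from the shipped app.
DESIGN_EXPERIENCES = {
    "bonnet_concentric_rings": "fire",
    "curved_red_design": "windbreak",
    # ... the remaining six designs painted by Jangala Rice
}

def experience_for(target_name):
    """Return the experience for a recognised target, or None if unknown."""
    return DESIGN_EXPERIENCES.get(target_name)
```

As the rest of the post explains, this one-target-per-design approach is what the curved bodywork and closely spaced designs eventually forced us to abandon.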
Thrown a curve ball
The biggest challenge turned out to be the fact that the car is not a flat surface but is made up of curved surfaces on which the images had been painted. Initially this was not considered an issue; however, on further investigation we discovered that Vuforia, the platform we used to build the app, works best on planar (flat) surfaces. We also noted that the Vuforia library can track 3D objects; however, the size is limited to about that of a cereal box … not quite a full-size vehicle! Additionally, the 3D model has to be created using the Vuforia Object Scanner, which is supported on only a handful of mobile phone handsets. So we realised that using a 3D object as a target, as we had planned, was impractical.
Despite this, we decided to pursue image targets even with the planar surface limitation (we like a challenge!). After considerable testing we found that photographs of the car taken from above worked best: the AR camera was able to identify the point of reference quite effectively and display the 3D experience with the correct orientation and minimal jitter. A top-view photo was taken of the car from a cherry picker, which was fun (see image above). Photos taken at an angle resulted in inconsistent orientation of the 3D experiences and were shaky due to unreliable target recognition. The tracking was far from optimal, and a few compromises had to be made to create a rewarding experience.
A close call
Another limiting factor was the close proximity of the designs on the car. The 3D experiences had to be coordinated so that only one would play at any given time. Moving the device away from a tracked target and then onto a different one did not work on our car, as all the targets were visible in the viewfinder when users pointed the device at it. We considered prompting the user to tap the screen to stop the playing AR experience and then giving them the opportunity to find another target. Testing revealed a number of flaws in this approach: the individual targets were not being tracked reliably, which meant an experience might end abruptly and the visitor give up. As a result, we decided there was not much point in using separate targets for each experience.
Start your engines
The compromise was to use two of the better targets – one for the bonnet and one for the boot. These two targets were used to trigger the whole experience, depending on whether the viewer was at the front or the rear of the car. Once a target was being tracked, the first experience would appear; the visitor would then be prompted to tap the screen to progress to the next experience. This streamlined the experience and made it less frustrating.
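The final interaction amounts to a small state machine: recognising either the bonnet or the boot target starts the sequence, and each tap advances to the next experience. The sketch below models that logic in plain Python; it is an assumption-laden illustration, not the shipped code, which implemented the equivalent behaviour on top of Vuforia's target-tracking callbacks inside the app.

```python
class ExperienceSequence:
    """Tap-to-advance sequence of AR experiences driven by one image target.

    Illustrative only. Whether the real app loops back to the first design
    after the last one, as this sketch does, is an assumption.
    """

    def __init__(self, experiences):
        self.experiences = experiences
        self.index = None  # nothing plays until a target is recognised

    def on_target_found(self):
        """Start the sequence when the bonnet or boot target is tracked."""
        if self.index is None:
            self.index = 0
        return self.current()

    def on_tap(self):
        """Advance to the next experience when the visitor taps the screen."""
        if self.index is not None:
            self.index = (self.index + 1) % len(self.experiences)
        return self.current()

    def current(self):
        """Return the experience currently playing, or None."""
        return None if self.index is None else self.experiences[self.index]
```

For example, a sequence built as `ExperienceSequence(["fire", "windbreak"])` plays nothing until `on_target_found()` is called, then steps through the designs on each `on_tap()`.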
Another hurdle was the angle between the viewing device and the car, which was exaggerated by the viewer's distance from it. In the museum setting the viewer had to stand a certain distance from the car to prevent any damage. This would not have been a problem if the image had been planar and vertically mounted (mounting a car on a wall was not practical!). The angle made the targets even more difficult to detect, especially from multiple positions. It took considerable testing to identify an optimum position and target that would work for both the bonnet and the boot. The optimum positions were then marked on the floor so that the viewer knew where to stand for the targets to work best.
At the finish line
So in the end there were several challenges to creating an app that worked effectively. The limiting factors included: the non-planar surface; the distance of the viewer from the car, required to protect it from damage; and the close proximity of the designs on the car. The solution we came up with uses the augmented reality functionality to trigger and position the experience as a whole. We also used image, text and audio content describing each design individually to complement the interactive element. This information appears on the screen as the 3D experiences play. This, together with the simplified use of the augmented reality triggers, will hopefully enrich the visitor experience.
Image Target Requirements: https://library.vuforia.com/articles/Training/Image-Target-Guide
Vuforia Object Scanner: https://library.vuforia.com/articles/Training/Vuforia-Object-Scanner-Users-Guide