Hands-On Review: Vuforia SDK 5
Yesterday, I had the chance to play with the recently released Vuforia SDK 5. I've written about it before and was excited by its promise. The SDK ships with its own set of AR and VR features, which intrigued me. So I thought, why not dive in head first and add another tool to my arsenal? Who knows, some project might benefit from it.
The SDK comes in three flavors: native iOS, native Android, and a Unity plugin. Since I'm pretty much working with Unity these days, I decided to pick the latter. As with my previous encounter with this SDK, installing and using it inside Unity is a breeze. Basically, the SDK gives me several prefabs that I can use straight away. But its biggest selling point is the image/object recognizer and tracker. With it, I can track whatever image or marker I want and augment it with any object. This is AR, Vuforia's bread-and-butter experience.
What's really new in this SDK version is the ability to pair the AR Camera (the camera that does the image recognition and tracking) with an additional third-party camera provided by an HMD SDK such as Google Cardboard or Oculus Mobile VR. This lets the application render in stereo, enabling a VR mode for your application. The mobile device is therefore used for two things:
- image tracking through its camera
- stereo rendering on its screen
Using this in a Unity project is pretty easy, though it might be complicated for beginners (the documentation is scattered across Vuforia's site at the moment). In case you need help, here's what I did to integrate it with the Google Cardboard SDK:
- Download the Vuforia SDK and the Google Cardboard SDK (version 0.5 or later is required)
- Create a license for the app. I used a mobile license, since I wanted to make an app for Google Cardboard
- Make a new project, or use the samples provided
- Import both SDKs' Unity packages into the project
- Add the CardboardMain prefab to the project
- Position it and attach it to the ARCamera prefab (add the ARCamera first)
- Add the license key to the ARCamera
- Finally, change some code as mentioned in the official docs
Afterwards, I finally had a working mixed reality application. For testing purposes, I built the ARVR sample from Vuforia and ran it on my OnePlus One inserted into a Google Cardboard. The result was amazing. The application managed to detect an image and augment it with a 3D object, and with its gaze-casting function, I could jump to VR mode by looking at a designated button. Wow. If desired, I could also jump back to AR mode by looking down and activating the button that appears.
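For the curious, that gaze-based mode switch boils down to a raycast from the active camera. The following is a minimal Unity C# sketch of the idea, not Vuforia's actual sample code: the `SwitchToVR`/`SwitchToAR` methods and the `VRButton`/`ARButton` tags are hypothetical placeholders for whatever the sample's scene manager does when the gaze lands on a button.

```csharp
using UnityEngine;

// Hypothetical gaze toggle: casts a ray along the camera's forward
// direction each frame and switches modes when it hits a tagged button.
public class GazeModeToggle : MonoBehaviour
{
    public Camera gazeCamera;          // the currently active (AR or VR) camera
    public float maxGazeDistance = 10f;

    void Update()
    {
        var gazeRay = new Ray(gazeCamera.transform.position,
                              gazeCamera.transform.forward);

        if (Physics.Raycast(gazeRay, out RaycastHit hit, maxGazeDistance))
        {
            if (hit.collider.CompareTag("VRButton"))
                SwitchToVR();
            else if (hit.collider.CompareTag("ARButton"))
                SwitchToAR();
        }
    }

    void SwitchToVR()
    {
        // Placeholder: in the actual sample the scene manager enables the
        // stereo rig and VR content here.
        Debug.Log("Switching to VR mode");
    }

    void SwitchToAR()
    {
        // Placeholder: re-enable the AR camera view and tracked content.
        Debug.Log("Switching to AR mode");
    }
}
```

The buttons themselves just need colliders and the matching tags; the raycast does the rest.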
Trying out Vuforia mixed reality sample. Gorgeous! I can jump from AR to VR instantly. Rad! pic.twitter.com/O9ZaE7xWCs
— Adityo Pratomo (@kotakmakan) August 3, 2015
However, I did find two things that bugged me. First, the phone gets very hot when doing this, probably because it's doing so many things at once: camera capture, image detection, 3D rendering, and stereo rendering, among others. Don't get me wrong, the experience was smooth, with no judder whatsoever. I believe that's partly because the application uses low-poly assets.
Second, I'd like to see the buttons and the scene manager packaged as prefabs, much like Vuforia's core assets. Though I can get by with the samples as a jumpstart, having them as reusable resources would help speed up development.
All in all, this is a very solid SDK and one to keep an eye on. I'm excited by its prospects, and I hope to make something with it in the very near future.