How to make physical objects come to life on HoloLens

Vuforia is a software development kit that lets you take a 2D image, or even a 3D object, in the real world and, through an augmented reality device (in our case, a HoloLens), overlay rendered graphics on top of the chosen image or object. Vuforia tracks the position and rotation of the real-world object, and the overlaid graphics move along with it. Vuforia uses a picture called an ImageTarget: a graphical or visual pattern that can be a 2D image or a simple 3D object. Once the software recognizes this pattern, it uses it as an anchor on which to overlay computer-rendered graphics.
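In a Unity scene, this anchoring boils down to making the rendered content a child of the ImageTarget GameObject, so it inherits whatever pose Vuforia assigns to the target each frame. Below is a minimal sketch of that idea using only Unity's Transform API; the field names are illustrative, not part of the Vuforia API.

```csharp
using UnityEngine;

// Minimal sketch: augmentation content is parented under the ImageTarget, so it
// automatically follows the pose Vuforia assigns to the target every frame.
// "imageTarget" and "augmentation" are placeholder references wired up in the Inspector.
public class AttachAugmentation : MonoBehaviour
{
    public Transform imageTarget;   // the ImageTarget GameObject in the scene
    public Transform augmentation;  // the content to overlay on it

    void Start()
    {
        // Parent the content under the target and center it on the target's origin;
        // from now on Vuforia's updates to the target move the content too.
        augmentation.SetParent(imageTarget, false);
        augmentation.localPosition = Vector3.zero;
        augmentation.localRotation = Quaternion.identity;
    }
}
```

In practice you would usually just drag the content under the ImageTarget in the Unity hierarchy; the script only makes the relationship explicit.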
Developing Vuforia Engine Apps for HoloLens

Vuforia Engine enhances the capability of HoloLens by allowing you to connect AR experiences to specific images and objects in the environment. You can use this capability to overlay step-by-step instructions on top of machinery or to add digital features to a physical product.
Based on the CAD data of your product or machinery, HoloLens can robustly detect those 3D objects in the environment using Vuforia's Model Targets technology. Augmented overlays can be authored directly against the CAD model, so your field workers see them right where you placed them.

Enterprise developers can use VuMarks to uniquely identify each piece of machinery on a factory floor, right down to the serial number. Additionally, VuMarks can be scaled into the billions and designed to look just like a company logo.

Existing Vuforia Engine apps built for phones and tablets can be configured in Unity to run on HoloLens, and you can even use Vuforia Engine to take your new HoloLens app to Windows 10 tablets such as the Surface Pro 4 and Surface Book.

Vuforia Engine automatically fuses the poses from camera tracking and HoloLens's spatial tracking to provide stable target poses regardless of whether the target is currently in the camera's view. Because this fusion is handled automatically, it requires no programming by the developer.

The following is a high-level description of the process (a short Unity sketch of how these tracking states surface to your scripts follows the list):
1. Vuforia Engine's Tracker recognizes the target.
2. Target tracking is then initialized.
3. The position and rotation of the target are analyzed to provide a robust pose estimate for HoloLens.
4. Vuforia Engine transforms the target's pose into the HoloLens spatial mapping coordinate space.
5. HoloLens takes over tracking if the target is no longer in view. Whenever you look at the target again, Vuforia continues to track the image or object accurately.
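These state changes are exposed to your Unity scripts through the target's TrackableBehaviour component. The following is a minimal sketch that simply logs the transitions; it assumes the classic Vuforia Unity API used by the HoloLens sample, and the class name and logging policy are ours.

```csharp
using UnityEngine;
using Vuforia;

// Minimal sketch: attach to an ImageTarget or Model Target GameObject to log how its
// tracking status changes (for example DETECTED -> TRACKED -> EXTENDED_TRACKED).
// Assumes the classic Vuforia Unity API (TrackableBehaviour.CurrentStatus).
public class TrackingStatusLogger : MonoBehaviour
{
    private TrackableBehaviour trackable;
    private TrackableBehaviour.Status? lastStatus;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
    }

    void Update()
    {
        if (trackable == null)
            return;

        TrackableBehaviour.Status status = trackable.CurrentStatus;
        if (lastStatus == null || status != lastStatus.Value)
        {
            Debug.Log(trackable.TrackableName + " status: " + status);
            lastStatus = status;
        }
    }
}
```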

Targets that are detected but no longer in view are reported as EXTENDED_TRACKED. In these cases, the Default Trackable Event Handler script that is used on all targets continues to render augmentation content. The developer can control this behavior by implementing a custom trackable event handler script, as sketched below.

The best way to understand the structure of a Vuforia Engine HoloLens project is to install and build the Vuforia HoloLens sample project. The sample provides a complete HoloLens project that includes pre-configured deployable scenes and project settings, and it gives you a starting point and reference for your own Vuforia HoloLens apps.
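The custom trackable event handler mentioned above follows the same pattern as the default one: register with the target's TrackableBehaviour and react to status changes. Here is a minimal sketch, assuming the classic ITrackableEventHandler interface from the Vuforia Unity SDK; the hide-on-extended-tracking policy and the class name are just an example, not the sample project's code.

```csharp
using UnityEngine;
using Vuforia;

// Minimal sketch of a custom trackable event handler. Unlike the default handler,
// it hides the augmentation whenever the target is only EXTENDED_TRACKED, so content
// is shown only while the camera actually sees the target.
// Assumes the classic ITrackableEventHandler interface; adjust to your Vuforia version.
public class HideWhenExtendedTracked : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour != null)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        // Render only while the target is directly detected or tracked by the camera;
        // EXTENDED_TRACKED and lost states hide the augmentation.
        bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED;
        SetVisible(visible);
    }

    private void SetVisible(bool visible)
    {
        foreach (Renderer r in GetComponentsInChildren<Renderer>(true))
            r.enabled = visible;
        foreach (Collider c in GetComponentsInChildren<Collider>(true))
            c.enabled = visible;
    }
}
```

Attach a script like this to the target in place of the default handler if you do not want content to persist while the target is out of view.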