Creating AR Content

  • iPad

Content creators can create Augmented Reality (AR) content for CLM to display on iPad devices, enabling more creativity when displaying product metrics, effects, and information. AR overlays virtual objects, known as models, onto a view of the real world, known as the scene. Augmented Reality differs from Virtual Reality (VR) in that AR blends virtual objects with the user's real surroundings, rather than completely replacing the surroundings with a virtual environment.

Veeva CRM supports ARKit 3.0.

Guidelines

The font, location, and size of headings and labels cannot be customized.

Models

All models must use the .dae, .obj, or .usdz file formats. If the model uses animations, the .dae or .usdz format must be used. For optimal performance, an individual model must not exceed 100,000 polygons.

Veeva recommends limiting a scene to fewer than 10 models.

Animations

IDs must be set for each animation to be referenced in the JSON file.

Animations should have recognizable names describing the associated object, for example, Disposable_Syringe, Iris, Left_Artery, etc.
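
For example, the following minimal SceneKit sketch (assuming a hypothetical Eye.dae file added to an Xcode test project; the file name is a placeholder) prints the animation identifiers contained in a .dae file so they can be matched against the IDs referenced in the JSON file.

    import Foundation
    import QuartzCore
    import SceneKit

    // Sketch only: lists the animation IDs found in a placeholder .dae file.
    if let url = Bundle.main.url(forResource: "Eye", withExtension: "dae"),
       let source = SCNSceneSource(url: url, options: nil) {
        // Each CAAnimation entry identifier is an animation ID.
        let animationIDs = source.identifiersOfEntries(withClass: CAAnimation.self)
        print("Animation IDs:", animationIDs)
    }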

Masks

Masks enable all or a portion of a model to be hidden when viewing the model from a certain angle. Masks must be a 3D mesh and can be any shape. The mask must be identified in the JSON file.

Use only one mask per model.

Textures

Textures can be created and edited using a raster graphics editor. For optimal performance, use textures to portray details on a model rather than adding polygons.

Limit each model to one set of 2048 x 2048 pixel textures.

Lighting

  • Lighting in Edit Mode – There are three nodes in edit mode:
      • a ring to display the model direction
      • a round shadow layer on the plane below the model
      • a directional light above the model so the model's shadow displays beneath
  • Lighting in Locked Mode – Specific lighting does not display in locked mode. The only lighting in the scene is ARKit's default ambient lighting, which is automatically adjusted based on the lighting of the surrounding environment, as illustrated in the sketch below. Any additional lighting must be added to the model itself.
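
As a rough illustration of that automatic adjustment, the sketch below uses standard ARKit light estimation APIs; the function names are placeholders, and none of this is required from content creators, since an ARSCNView applies the estimate automatically.

    import ARKit

    // Sketch only: ARKit's ambient light estimation, which an ARSCNView
    // applies to the scene automatically by default.
    func startLightEstimation(in sceneView: ARSCNView) {
        sceneView.automaticallyUpdatesLighting = true        // default behavior
        let configuration = ARWorldTrackingConfiguration()
        configuration.isLightEstimationEnabled = true        // on by default
        sceneView.session.run(configuration)
    }

    // The raw estimate can also be inspected, for example after a frame update.
    func logLightEstimate(for sceneView: ARSCNView) {
        if let estimate = sceneView.session.currentFrame?.lightEstimate {
            print("Ambient intensity (lumens):", estimate.ambientIntensity)
            print("Color temperature (K):", estimate.ambientColorTemperature)
        }
    }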

Content creators can customize scene lighting. See Customizing AR Lighting for more information.

Preparing Files

After optimizing and testing a model and its associated textures, normal maps, audio, and animations, place the model and its assets in a folder.

For example, the folder Eye contains the model file for a human eye along with all other assets used by the model.

Repeat this process for all models in an AR scene.

Creating AR Models

When designing 3D models, note that Xcode scenes use a Y-up axis. Set the position of the origin of the axes to match how the model is used. For example, if the model is expected to sit on a horizontal surface, the origin should be at the bottom of the model. Size the model according to the environment in which it is used.
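
For example, the following minimal SceneKit sketch (the helper name is hypothetical and not part of the Veeva AR tooling) moves a node's origin to the bottom center of its bounding box so the model sits on a horizontal surface under a Y-up convention.

    import SceneKit

    // Sketch only: shifts a node's pivot so its local origin is at the
    // bottom center of its bounding box (Y-up convention).
    func moveOriginToBottom(of node: SCNNode) {
        let (min, max) = node.boundingBox
        node.pivot = SCNMatrix4MakeTranslation(
            (min.x + max.x) / 2,   // centered on X
            min.y,                 // bottom on Y
            (min.z + max.z) / 2    // centered on Z
        )
    }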

Building Interactivity and Animation into Models

To create 3D models for Veeva AR:

  • Modeling: Veeva recommends Autodesk Maya 2017 or later. Other software may also work as long as it supports the .dae file format. The entire scene should not exceed approximately 1 million polygons; too much detail and too many polygons can lead to longer load times. Rendering speeds can vary widely depending on the types of models created, so test the rendering speed of the model on the target devices within Xcode before finalizing.
  • Animations: Animations are created during the modeling process. Each animation must be referenced by ID in the JSON file.
  • Masking: If you are using masking, the mask must be a 3D mesh. It can be any shape as long as it covers everything that needs to be masked. Once exported and brought into SceneKit, the mesh must be set as a mask; while loading, its material is changed to a default SceneKit material with almost zero transparency and a rendering order higher than the default value (see the sketch after this list).
  • Exporting: Export the file from Maya as a DAE file instead of an OBJ file if you have animations in the model; the OBJ file format does not export animations. Exporting DAE files can be slow, so consider using an alternate converter to expedite the process.
  • Texturing and Normal Maps: You can use Photoshop to create or alter the textures and normal maps of a 3D object. Instead of creating minute details on the models (wrinkles, scratches, etc.) as polygons, it is better to use the normal map to handle these. This enables you to create more realistic models without placing additional load and rendering burdens on the device. To make details stand out, increase the height of the detail on the normal map. Normal maps, together with other textures, must be sent to ARKit separately, as DAE files do not export them. Specify the name of the normal map file in the JSON file to load it with the model.
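
The following is a rough SceneKit sketch of the mask material settings described in the Masking step above (near-zero transparency and a rendering order above the default). The function and node names are placeholders; the Veeva AR loader applies the equivalent changes automatically once the mask is identified in the JSON file.

    import SceneKit

    // Sketch only: approximates the mask configuration described above.
    // maskNode is assumed to be the 3D mesh exported as the mask.
    func configureMask(_ maskNode: SCNNode) {
        let material = SCNMaterial()          // default SceneKit material
        material.transparency = 0.001         // almost zero transparency
        maskNode.geometry?.firstMaterial = material
        maskNode.renderingOrder = 1           // higher than the default of 0
    }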

Watch Apple’s video on Creating Great AR Experiences to learn more about creating 3D content.

Optimizing Models Using Xcode Script

Before you can use your model in Veeva AR, it needs to be optimized to work properly. To do this:

  1. Export the model from the 3D Model editor (Maya) as a [model].obj file (no animations) or as a [model].dae file (includes animations).
  2. Place the model, all textures, audio files, and the overlay image into a folder, for example [model].scnassets.
  3. Run the following script from the [model].scnassets folder to create an optimized file for Xcode:

     /Applications/Xcode.app/Contents/Developer/usr/bin/scntool --convert [model].dae --format c3d --output [new model name].dae --force-y-up --force-interleaved --look-for-pvrtc-image

  4. Create an ARKit template project and import the model into the project to test the model and to get and check the animation names, textures, etc. (see the sketch after these steps).
  5. Zip the folder with only the optimized file, audio, overlay image, and textures, not the original .dae file.
  6. Configure the JSON file. See Defining AR Scenes for more information.
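
For step 4, the following minimal sketch (assuming a hypothetical Eye.scnassets/Eye.dae asset inside an Xcode ARKit template project) loads the optimized model and logs its node names and animation keys so they can be checked against the JSON file.

    import UIKit
    import SceneKit
    import ARKit

    // Sketch only: loads a placeholder optimized model in an ARKit template
    // project and logs node names and animation keys for verification.
    class ViewController: UIViewController {
        @IBOutlet var sceneView: ARSCNView!

        override func viewDidLoad() {
            super.viewDidLoad()
            guard let scene = SCNScene(named: "Eye.scnassets/Eye.dae") else {
                fatalError("Optimized model failed to load")
            }
            sceneView.scene = scene
            // Print every node's name and its animation keys.
            scene.rootNode.enumerateHierarchy { node, _ in
                print(node.name ?? "<unnamed>", node.animationKeys)
            }
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            sceneView.session.run(ARWorldTrackingConfiguration())
        }
    }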

See this blog post for more information about optimizing files with SceneKit.