

In many complex tasks, a remote subject-matter expert may need to assist a local user to guide actions on objects in the local user's environment. However, effective spatial referencing and action demonstration in a remote physical environment can be challenging. We introduce two approaches that use Virtual Reality (VR) or Augmented Reality (AR) for the remote expert, and AR for the local user, each wearing a stereo head-worn display. Both approaches allow the expert to create and manipulate virtual replicas of physical objects in the local environment to refer to parts of those physical objects and to indicate actions on them. This can be especially useful for parts that are occluded or difficult to access. In one approach, the expert points in 3D to portions of virtual replicas to annotate them. In another approach, the expert demonstrates actions in 3D by manipulating virtual replicas, supported by constraints and annotations. We performed a user study of a 6DOF alignment task, a key operation in many physical task domains, comparing both approaches to an approach in which the expert uses a 2D tablet-based drawing system similar to ones developed for prior work on remote assistance. The study showed the 3D demonstration approach to be faster than the others. In addition, the 3D pointing approach was faster than the 2D tablet in the case of a highly trained expert.
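Success in a 6DOF alignment task is typically measured by how closely the manipulated object's final pose matches a target pose. The following is a minimal sketch, not taken from the paper, of one common way to compute translational and rotational alignment error, assuming poses are given as a 3D position plus a unit quaternion in (w, x, y, z) order:

```python
import math

def alignment_error(pos_a, pos_b, quat_a, quat_b):
    """Return (position error in input units, rotation error in degrees)
    between two 6DOF poses. Quaternions are unit, (w, x, y, z) order."""
    # Translational error: Euclidean distance between the two positions.
    t_err = math.dist(pos_a, pos_b)
    # Rotational error: angle of the relative rotation between the two
    # orientations. The absolute value of the quaternion dot product handles
    # the double cover (q and -q represent the same rotation).
    dot = abs(sum(a * b for a, b in zip(quat_a, quat_b)))
    r_err = math.degrees(2.0 * math.acos(min(1.0, dot)))
    return t_err, r_err
```

A study analysis might threshold both values (e.g., a few millimeters and a few degrees) to decide when the replica is considered aligned; the specific thresholds here are illustrative, not from the study.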
