
Medical Augmented Reality on the Battlefield (ARTEMIS)

Official Website: ARTEMIS

This project supports the use of augmented reality (AR) as a platform to guide first responders, whether trained or untrained, in assessing and treating injuries on the battlefield. The project comprises a comprehensive software architecture spanning multidisciplinary areas: AR/VR user interface development, deformable computer-vision tracking and fusion, and multi-modal sensor measurements (e.g., depth camera, RGB, inertial) that together create high situational awareness between the mentor and the responder. Key innovations that ARClab provides include:


  • Soft-tissue tracking in camera images

  • Reconstruction of 3D scenes from multiple RGB-D sources

  • GPU-accelerated dynamic fusion of the surgical scene

  • Bilateral communication of data between AR/VR equipment and camera sensors
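To make the multi-camera reconstruction step above concrete, the following is a minimal sketch (not the ARTEMIS implementation) of fusing depth frames from multiple RGB-D cameras into a single world-frame point cloud: each depth image is back-projected through its camera intrinsics, then transformed by a per-camera extrinsic calibration. All function names and parameters here are illustrative assumptions; a real pipeline would add color, filtering, and GPU-accelerated volumetric fusion.

```python
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) to an N x 3 point cloud
    in the camera frame, using pinhole intrinsics (fx, fy, cx, cy)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

def fuse(clouds, extrinsics):
    """Merge per-camera point clouds into one world-frame cloud.
    extrinsics[i] is the 4x4 camera-to-world transform for camera i."""
    world = []
    for pts, T in zip(clouds, extrinsics):
        homog = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
        world.append((homog @ T.T)[:, :3])                # apply T to each point
    return np.vstack(world)

# Usage with two synthetic 2x2 depth frames and a second camera
# offset 1 m along z (purely illustrative numbers):
pts = backproject(np.ones((2, 2)), fx=1.0, fy=1.0, cx=1.0, cy=1.0)
T = np.eye(4)
T[2, 3] = 1.0
merged = fuse([pts, pts], [np.eye(4), T])
```

In a deployed system the extrinsics would come from a one-time multi-camera calibration (e.g., against a shared fiducial), and the merged cloud would feed the dynamic-fusion stage rather than being kept as raw points.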


Our hardware platforms that are incorporated into the framework include:

  • Intel Realsense

  • Microsoft Kinect Azure

  • Microsoft Hololens

  • HTC Vive Pro

  • Magic Leap One

  • Optitrack


This project is a tight collaboration with Dr. Nadir Weibel in Computer Science and the Weibel Lab, together with clinical collaborators at the Navy Balboa Medical Center led by PI CAPT Konrad Davis. The project is funded under the USMRMC Defense Health Project.



Students and Collaborators

Weibel Lab

Jingpei Liu

Albert Liao




ARTEMIS: Mixed-Reality Environment for Immersive Surgical Telementoring

N. Weibel, J. Johnson, T. Sharkey, Z. Robin Xu, E. Zavala, K. Davis, D. Gasques, X. Zhang, M.C. Yip

Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, 2020.

