enFaced - Virtual and Augmented Reality Training and Navigation Module for 3D-Printed Facial Defect Reconstructions


This interdisciplinary research project between computer science and medicine targets the development of a comprehensive image-guided tool for head and neck surgery, with a main focus on mandibular and mid-facial fractures. The tool supports physicians in all treatment stages (diagnosis, planning, intervention and monitoring) and can also be used during clinical traineeship. A key point is the development of an algorithm for the semi-automatic and automatic segmentation of bone structures and soft tissue in CT and MRI acquisitions. Segmentation enables the three-dimensional localization, quantification and visualization of biological structures in a very short time. Based on radiological images from the clinical routine, diagnoses can be visualized, and surgical planning processes or alternative surgery options can be simulated. In addition, postoperative simulation results can be presented preoperatively in a photo-realistic manner. In particular, this will be applied to the repositioning of bony structures, the reconstruction of facial defects and the removal of tumors in complex anatomical areas.

Moreover, we plan the development of patient-individual three-dimensional implants, which will be 3D printed in-house, both for comparison with externally manufactured ones and for clinical use. Furthermore, medical personnel can be supported in real time during surgery by interactive navigation, in which Augmented Reality integrates computer-generated objects into the operating field. Through pre- and intraoperative simulations, the application reduces surgical complications and helps ensure a successful treatment result. Hence, the number of necessary corrective surgeries and the required operation time can be reduced, and a higher patient survival rate can be achieved. In addition, the combination with Virtual Reality glasses allows the virtual training of operations during medical education.
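At its simplest, the bone segmentation step described above can be illustrated as intensity thresholding on a CT volume in Hounsfield units, followed by a volume quantification of the resulting mask. The sketch below is purely illustrative and is not the project's algorithm: the function names, the synthetic volume, and the 300 HU bone threshold are assumptions for demonstration only.

```python
import numpy as np

def segment_bone(ct_volume, threshold_hu=300.0):
    """Return a binary bone mask via Hounsfield-unit thresholding.

    Cortical bone in CT typically exceeds ~300 HU, while soft tissue
    lies well below; the default here is illustrative, not clinically
    validated.
    """
    return ct_volume >= threshold_hu

def segmented_volume_mm3(mask, voxel_spacing_mm=(1.0, 1.0, 1.0)):
    """Approximate the segmented volume in cubic millimetres."""
    voxel_volume = float(np.prod(voxel_spacing_mm))
    return int(mask.sum()) * voxel_volume

# Synthetic "CT": soft-tissue background (~40 HU) containing
# a bone-like block (~700 HU) of 8 x 8 x 8 voxels.
ct = np.full((32, 32, 32), 40.0)
ct[8:16, 8:16, 8:16] = 700.0

mask = segment_bone(ct)
print(int(mask.sum()))              # 512 voxels classified as bone
print(segmented_volume_mm3(mask))   # 512.0 mm^3 at 1 mm isotropic spacing
```

In practice, simple thresholding fails near thin bone and at metal artifacts, which is exactly why the project investigates dedicated semi-automatic and learning-based segmentation methods.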
The objective of the project is the establishment of an open-source tool as a basis for further research and development at the participating universities. Currently, comparable tools are protected by strict licensing conditions and are not freely accessible for academic research. Existing software for clinical practice is not functionally stable enough and has error rates that are too high. Additionally, its use imposes high financial costs, which makes widespread application impossible. Finally, Augmented Reality in the clinical routine is an important topic for current surgical research and cannot be ignored by modern medical universities.


TUG: Dieter Schmalstieg

MUG: Jürgen Wallner, Mauro Pau, Wolfgang Zemann, Norbert Jakse

TUG/MUG: Christina Gsaxner, Jan Egger




If you are interested in doing a Bachelor Thesis, Master Project / Seminar or Master Thesis within this project, do not hesitate to contact us! Note: Biomedical Engineering Students are welcome!

Computer Algorithms for Medicine Laboratory

MICCAI Runners-Up

Gsaxner C, Pepe A, Wallner J, Schmalstieg D, Egger J. Markerless Image-to-Face Registration for Untethered Augmented Reality in Head and Neck Surgery. MICCAI, pp. 236-244, Oral (~3% Acceptance Rate) and Runners-up (15 Finalists out of 1729 Submissions, <1%), 2019.

Best Paper Award

Pepe A, Trotta GF, Wallner J, Gsaxner C, Schmalstieg D, Egger J, Bevilacqua V. IEEE BMEiCON, Best Paper Award 2018. The 11th Biomedical Engineering International Conference, Chiang Mai, Thailand, 2018. Certificate

Selected Publications

Wallner J, Mischak I, Egger J. Computed tomography data collection of the complete human mandible and valid clinical ground truth models. Sci Data. 2019; 6:190003. Open Access

Pepe A, Trotta GF, Wallner J, Gsaxner C, Schmalstieg D, Egger J, Bevilacqua V. Pattern Recognition and Mixed Reality for Computer-Aided Maxillofacial Surgery and Oncological Assessment. 11th Biomedical Engineering International Conference (BMEiCON), Chiang Mai, Thailand, pp. 1-5, 2018. IEEE

Gsaxner C, Pfarrkirchner B, Lindner L, Pepe A, Roth PM, Wallner J, Egger J. PET-Train: Automatic Ground Truth Generation from PET Acquisitions for Urinary Bladder Segmentation in CT Images using Deep Learning. 11th Biomedical Engineering International Conference (BMEiCON), Chiang Mai, Thailand, pp. 1-5, 2018. IEEE

Egger J, Pfarrkirchner B, Gsaxner C, Lindner L, Schmalstieg D, Wallner J. Fully Convolutional Mandible Segmentation on a valid Ground-Truth Dataset. Conf Proc IEEE Eng Med Biol Soc. 2018; 656-660. PubMed

Schwaiger M, Wallner J, Pau M, Feichtinger M, Zrnc T, Zemann W, Metzler P. Clinical experience with a novel structure designed bridging plate system for segmental mandibular reconstruction: The TriLock bridging plate. J Craniomaxillofac Surg. 2018; 46(9):1679-90. PubMed

Wallner J, Hochegger K, Chen X, Mischak I, Reinbacher K, Pau M, Zrnc T, Schwenzer-Zimmerer K, Zemann W, Schmalstieg D, Egger J. Clinical evaluation of semi-automatic open-source algorithmic software segmentation of the mandibular bone: Practical feasibility and assessment of a new course of action. PLoS One. 2018;13(5):e0196378. Open Access

Bruneder S, Wallner J, Weiglein A, Kmecova L, Egger J, Pilsl U, Zemann W. Anatomy of the Le Fort I segment: Are arterial variations a potential risk factor for avascular bone necrosis in Le Fort I osteotomies? J Craniomaxillofac Surg. 2018; 46(8):1285-95. PubMed