Mixed Reality

Tutorial I


The 2003 International Symposium on Collaborative Technologies and Systems (CTS’03)

January 19 - 23, 2003



Mixed Reality:

Blending the Real, Virtual and Imagined


Charles E. Hughes

University of Central Florida

Orlando, Florida, USA




Mixed Reality (MR) refers to experiences that lie strictly between the purely virtual and the unaltered real world.  Experiences centered in the real world, with virtual augmentation, are referred to as Augmented Reality (AR).  Those nearer the virtual end of the continuum are referred to as Augmented Virtuality (AV).  Both ends of this continuum will be covered in this presentation, with the added twist of considering the role of imagination as central to MR experiences.

MR environments commonly use see-through Head Mounted Displays (HMDs) to alter the user's visual experience.  Such HMDs can be optical see-through (the real world is seen directly, with synthetic overlays) or video see-through (the real world is captured by cameras on the front of the HMD, altered to add effects including synthetic imagery, and then presented on displays inside the HMD).  In addition to HMDs, MR environments require tracking technologies to determine the position and orientation of users (and other real objects), registration algorithms to determine the relative positions of real and virtual objects, and rendering algorithms to manage such issues as mutual occlusion and proper illumination.  Additionally, MR environments often include spatial sound and show effects devices.  Spatial sound provides a richer immersive experience and is critical to training scenarios.  Show effects devices, commonly used in theatrical productions, allow virtual objects to affect the behaviors of real objects (real affecting virtual is much easier).
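To make the registration step above concrete, the sketch below shows (in Python with NumPy) how a tracker's estimate of the HMD camera's pose can be used to express a virtual object's world-space position in camera coordinates for rendering over live video.  All function names and numeric values here are illustrative assumptions, not part of any system described in this tutorial.

```python
# Hypothetical sketch of MR registration: given the tracker's estimate of
# the camera (HMD) pose in world coordinates, express a virtual object's
# world-space position in camera coordinates so it can be rendered over
# the live video feed.  Names and values are illustrative only.
import numpy as np

def camera_from_world(camera_pose_world: np.ndarray) -> np.ndarray:
    """Invert a rigid 4x4 camera-to-world pose to get world-to-camera."""
    R = camera_pose_world[:3, :3]
    t = camera_pose_world[:3, 3]
    inv = np.eye(4)
    inv[:3, :3] = R.T            # inverse of a rotation is its transpose
    inv[:3, 3] = -R.T @ t
    return inv

def register(virtual_point_world, camera_pose_world) -> np.ndarray:
    """Map a virtual object's world-space point into camera space."""
    p = np.append(np.asarray(virtual_point_world, dtype=float), 1.0)
    return (camera_from_world(camera_pose_world) @ p)[:3]

# Example: camera at (0, 0, 2) with identity rotation; a virtual object
# at the world origin lies 2 units along the camera's -z axis.
pose = np.eye(4)
pose[:3, 3] = [0.0, 0.0, 2.0]
print(register([0.0, 0.0, 0.0], pose))
```

In a full MR pipeline this transform would be chained with the camera's projection model, and the tracking estimate would be refreshed every frame; here only the rigid-body registration step is shown.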

This presentation will discuss MR technologies and research problems, focusing on applications ranging from military training to entertainment.  Specific examples of technologies will include the Canon COASTAR video see-through HMD, an optical see-through HMD developed at UCF's School of Optics, a variety of tracking devices (all off-the-shelf) and a number of show control special effects.  Applications will include training for military operations in urban terrain, distributed collaborative environments, multi-person immersive entertainment and informal science education.



Intended Audience: Scientists, engineers and practitioners interested in the state of the art in mixed and augmented realities and their applications to collaborative systems.



Prerequisites: A background in basic science and engineering, programming and/or modeling and simulation.



Duration: 2 hours



Presentation Format: PowerPoint slides using an LCD projector.  Tutorial notes will be made available.



Charles E. Hughes is Professor of Computer Science in the School of Electrical Engineering and Computer Science at the University of Central Florida (UCF), Orlando, Florida.  He received the B.A. degree in Mathematics from Northeastern University in 1966, and the M.S. and Ph.D. degrees in Computer Science from Penn State University in 1968 and 1970, respectively.  He served on the Computer Science faculties of Penn State University and the University of Tennessee prior to joining UCF in 1980.  His research interests are in mixed reality, distributed interactive simulation, and models of distributed computation.  He has published approximately 90 journal and proceedings papers, and has received over $4M in research funding.  His most recent funding has come from the Army RDE Command, Canon Mixed Reality Labs, Army STRICOM, and the National Science Foundation.  He is a member of the ACM, the IEEE and the IEEE Computer Society.