Johns Hopkins APL Technical Digest


Exploring Immersive Technology at APL
Volume 35, Number 3 (2020)

Exploring Immersive Technology at APL: Guest Editor’s Introduction

Nicholas W. DeMatt

Immersive technologies, including virtual and augmented reality, have long been used in a variety of application areas, such as gaming, teaming, training, health care, and engineering. In fact, the Johns Hopkins University Applied Physics Laboratory (APL) has been exploring these technologies for more than two decades. Today these technologies have coalesced into a vast and growing domain, often referred to as XR, with innovative applications and use cases in nearly every sector, from gaming and entertainment to real estate, retail, architecture, and education, to name just a few. This recent explosion in the XR ecosystem, including the introduction of more capable commercial products and improved software development tools, prompted APL staff members to explore new research and development efforts leveraging XR. This issue highlights several of these efforts.

Overview of Immersive Technology: Terminology, State of the Art, and APL Efforts

Scott D. Simpkins, Patrick D. Allen, and Nicholas W. DeMatt

Immersive technologies have roots dating back to the 1800s. The Johns Hopkins University Applied Physics Laboratory (APL) has been exploring how to use these technologies to meet critical national needs for decades. Today these technologies constitute an entire domain, replete with its own lexicon, and commercially available tools have become more capable and less expensive. There are use cases for immersive technologies in just about every field, from the entertainment industry to health care to defense. This article reviews the history of immersive technologies, clarifies some of the terminology used to describe them, and presents the current state of the art. It then presents 15 examples of APL work in a wide range of application areas, including intelligence, defense, first response, medicine, space, human factors, education, and research, and it concludes with some additional use cases.

An Asset Pipeline for Creating Immersive Experiences: From CAD Model to Game Engine

Blake A. Schreurs

When an object is designed for manufacture and use, it is vital that various members of the design team, as well as the object’s end users, be able to fully comprehend the purpose and value of the design. Engineers at the Johns Hopkins University Applied Physics Laboratory (APL) call this “experiencing design intelligence.” When designing objects for use by people, following human-centered design practices helps the design team leverage those people’s experience to inform the object being created. Computer-aided design (CAD), in use for decades, has revolutionized the design and manufacturing processes for objects ranging from toys to buildings to spacecraft. CAD tools provide a wealth of information that is essential for many modern operations, and today CAD data can be combined with newer technologies to create immersive experiences that provide even more information and lead to even greater design intelligence for both design teams and end users. This article presents a nominal asset pipeline, including best practices, for taking a CAD model into a game engine to create an immersive experience.
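
To make the pipeline concrete, the sketch below is an illustration, not the article’s implementation: it uses the open-source trimesh library to take a tessellated CAD export through two typical pipeline steps, polygon decimation and re-export in a game-engine-friendly format. The file names and face budget are hypothetical.

```python
# A minimal sketch of two common CAD-to-game-engine pipeline steps, assuming
# the open-source trimesh library; file names and face budget are hypothetical.
import trimesh

# Load a tessellated CAD export. CAD kernels typically emit far more
# polygons than a real-time engine can comfortably render.
mesh = trimesh.load("bracket_export.stl")
print(f"Imported {len(mesh.faces)} faces")

# Decimate toward a real-time face budget. In trimesh this method needs an
# optional backend (e.g., the fast-simplification package) to be installed.
FACE_BUDGET = 5_000
if len(mesh.faces) > FACE_BUDGET:
    mesh = mesh.simplify_quadric_decimation(face_count=FACE_BUDGET)

# Re-center so the engine's pivot lands at the model's centroid.
mesh.apply_translation(-mesh.centroid)

# Export binary glTF, a format game engines such as Unity or Unreal can
# ingest through standard importers.
mesh.export("bracket_runtime.glb")
```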

Developing Project Proto-HEAD (Prototype Holographic Environment for Analysis of Data)

James L. Dean

With Project Proto-HEAD, or Prototype Holographic Environment for Analysis of Data, a Johns Hopkins University Applied Physics Laboratory (APL) team sought to learn more about augmented reality (AR) and whether it significantly improves simulation and data analysis over a PC workstation. Proto-HEAD had a simple concept: given a 3-D data set, visualize that data set as a hologram within the physical world. The development process afforded the team significant insight into the challenges of, and potential approaches for, visualizing and interacting with very large data sets. These insights are useful for new applications that visualize and interact with large data sets and complex models.
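
The article describes Proto-HEAD’s own approach; purely as an illustration of one standard tactic for taming very large 3-D data sets, the sketch below voxel-downsamples a point cloud with NumPy so that a renderer draws one representative point per occupied cell. It is not Proto-HEAD’s implementation, and the voxel size is an arbitrary assumption.

```python
# Sketch: voxel-grid downsampling of a large point cloud, a standard tactic
# for keeping a holographic renderer within its draw budget. Illustrative
# only (not Proto-HEAD's implementation); the voxel size is an assumption.
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float = 0.05) -> np.ndarray:
    """Return one representative (mean) point per occupied voxel.

    points: (N, 3) array of XYZ coordinates; voxel_size in the same units.
    """
    # Integer index of the voxel containing each point.
    voxel_ids = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel; `inverse` maps each point to its voxel's group.
    _, inverse = np.unique(voxel_ids, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.zeros(inverse.max() + 1)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

# Example: a million random points collapse to a much smaller renderable set.
cloud = np.random.rand(1_000_000, 3)
print(voxel_downsample(cloud).shape)
```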

Project Minard: A Platform for War-Rooming and Geospatial Analysis in Virtual Space

Stephen A. Bailey, Justin R. Renga, Joseph T. Downs, Miller L. Wilt, Brock A. Wester, and Jordan K. Matelsky

The geographic movement of individuals and assets is a complicated maze of relational data, further complicated by the individuals’ relationships or allegiances to organizations and regions. Understanding this depth and complexity of information is difficult even on purpose-built systems using conventional compute architectures. A Johns Hopkins University Applied Physics Laboratory (APL) project, called Minard, upgrades the war room to a virtual reality (VR) space. This system provides analysts with a collaborative and secure virtual environment in which they can interact with and study complex and noisy data such as alliances, the transit of individuals or groups through 3-D space, and the evolution of relationships through time. APL engineers designed intelligent visualization systems to bring the best of human intuition to state-of-the-art VR, with human–machine teams interacting both through a VR headset and at a conventional computer terminal.
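
As a toy illustration of the kind of relational data Minard handles (not the system’s actual data model), the sketch below stores timestamped transits between locations in a multigraph and queries a time window. All names, places, and times are hypothetical.

```python
# Sketch: timestamped transits between locations as a multigraph, the kind
# of relational data a war-room tool must query. Uses networkx; all names,
# places, and times are hypothetical illustration data.
import networkx as nx
from datetime import datetime

transits = nx.MultiDiGraph()
transits.add_edge("Actor A", "Port X", when=datetime(2019, 3, 1, 8, 30))
transits.add_edge("Actor A", "Port Y", when=datetime(2019, 3, 2, 14, 0))
transits.add_edge("Actor B", "Port X", when=datetime(2019, 3, 2, 14, 5))

def transits_between(g, start, end):
    """Yield (who, where, when) for transits inside a time window."""
    for who, where, data in g.edges(data=True):
        if start <= data["when"] <= end:
            yield who, where, data["when"]

# Who moved through the network on March 2?
for record in transits_between(transits,
                               datetime(2019, 3, 2), datetime(2019, 3, 3)):
    print(record)
```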

Minerva: Applied Software Engineering for XR

Blake A. Schreurs

This article describes Minerva (Multiuser Intuitive Exploitation and Visualization), a proof of concept demonstrating a dynamic multiuser alternative to traditional file-oriented intelligence processing and dissemination pipelines. The main goal of the effort was to enable multiple clients to look at the same data in the ways that work best for them, depending on the type of device they are using. Although Minerva did not evolve into a fully developed software product, the team of developers at the Johns Hopkins University Applied Physics Laboratory (APL) created a functional prototype and demonstrated effective use of standards and microservices for hosting and streaming static content. APL teams are applying the lessons learned from this effort to other projects seeking to take advantage of XR to improve traditional data production and representation pipelines.

Mixed Reality Social Prosthetic System

Ariel M. Greenberg and Jason A. Spitaletta

A Johns Hopkins University Applied Physics Laboratory (APL) team conceived of and developed a first-of-its-kind mixed reality “social prosthetic” system aimed at improving emotion recognition training and performance by displaying information about nonverbal signals in a way that is easily interpretable by a user. Called IN:URfACE (for Investigating Non-verbals: Using xReality for the Augmented Consideration of Emotion), the proof-of-concept prototype system uses infrared sensors to measure facial movements, pupil size, blink rate, and gaze direction. These signals are synchronized in real time, registered in real space, and then overlaid on the face of an interaction partner, such as an interviewee, through a mixed reality headset. The result is dramatic accentuation of subtle changes in the face, including changes that people are not usually aware of, like pupil dilation or nostril flare. The ability to discern these changes has applications in fields such as law enforcement, intelligence collection, and health care. This article describes how the system works, the technical challenges and solutions in designing it, and possible areas of application.
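
As a toy stand-in for one of the measured signals (not the system’s actual algorithm), the sketch below estimates blink rate from a per-frame eye-openness score by counting threshold crossings. The threshold, the notion of an “openness” score, and the sample signal are all assumptions.

```python
# Sketch: estimating blink rate from a per-frame eye-openness signal by
# simple threshold crossing. A toy stand-in for IN:URfACE's sensing
# pipeline; the threshold and sample signal are assumptions.
def blink_rate(openness, fps, closed_below=0.2):
    """Count closed->open cycles and convert to blinks per minute.

    openness: sequence of eye-openness scores in [0, 1], one per frame.
    fps: capture frame rate.
    """
    blinks, closed = 0, False
    for value in openness:
        if value < closed_below and not closed:
            closed = True            # eye just closed
        elif value >= closed_below and closed:
            closed = False           # eye reopened: one full blink
            blinks += 1
    minutes = len(openness) / fps / 60.0
    return blinks / minutes if minutes else 0.0

# Example: a 2-second clip at 30 fps containing one blink.
signal = [0.9] * 25 + [0.1] * 5 + [0.9] * 30
print(f"{blink_rate(signal, fps=30):.0f} blinks/min")
```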

Simulated X-Ray Vision Using Mixed Reality

Stephen A. Bailey, Miguel A. Rufino, Rodrigo-Rene R. Munoz-Abujder, and Hirsh R. Goldberg

A Johns Hopkins University Applied Physics Laboratory (APL) team created an application for the Microsoft HoloLens, a mixed reality (MR) head-mounted display (HMD), that serves as a proof of concept for a capability that would allow warfighters to observe their surroundings beyond their immediate line of sight. The approach uses a high-fidelity 3-D reconstruction of the beyond-line-of-sight (BLOS) environment in the form of a point cloud based on data from remote sensors and overlays it on the physical surfaces in the user’s surroundings as seen through the HMD. This approach allows the user to observe areas beyond their line of sight without ever physically occupying or directly observing the space.
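
The basic geometric step behind any such overlay is expressing remotely sensed points in the headset’s world frame. The NumPy sketch below applies a 4x4 rigid transform for that purpose; the transform values are placeholders that would, in practice, come from a calibration or localization step not shown here.

```python
# Sketch: re-expressing a remote sensor's point cloud in the headset's world
# frame with a 4x4 rigid transform, the basic registration step behind a
# beyond-line-of-sight overlay. Transform values are placeholders; in
# practice they come from calibration/localization, which is not shown.
import numpy as np

def register_points(points_sensor: np.ndarray, T_world_sensor: np.ndarray) -> np.ndarray:
    """Transform (N, 3) sensor-frame points into the world frame."""
    n = points_sensor.shape[0]
    homogeneous = np.hstack([points_sensor, np.ones((n, 1))])  # (N, 4)
    return (T_world_sensor @ homogeneous.T).T[:, :3]

# Placeholder transform: sensor sits 10 m ahead, rotated 90 deg about Z.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
T = np.array([[c, -s, 0, 10.0],
              [s,  c, 0,  0.0],
              [0,  0, 1,  0.0],
              [0,  0, 0,  1.0]])

cloud = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.5]])
print(register_points(cloud, T))
```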

Mixed Reality for Post-Disaster Situational Awareness

James P. Howard II, Arthur O. Tucker IV, Stephen A. Bailey, James L. Dean, Michael P. Boyle, Christopher D. Stiles, and William C. Woodcock

When disaster strikes, what once was, no longer is; and what is, is unrecognizable. The ability to understand the way things were is critical for those working in the response, rescue, and recovery phases of disaster management, from first responders to insurance claims agents. A team at the Johns Hopkins University Applied Physics Laboratory (APL) is developing a system that uses 3-D modeling data and precise GPS positioning data to display an image of structures “in place” through a mixed reality head-mounted display for first responders in a disaster scenario. The proof of concept revealed significant challenges; the team has proposed solutions and is working to test them. This technology is designed to assist first responders but will also have applications in other areas that require real-time data presentation, such as battlefield situational awareness.
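
Anchoring a model “in place” requires converting geodetic GPS coordinates into meter offsets in a local tangent plane around the user. The sketch below uses a small-area equirectangular approximation to illustrate the conversion; a fielded system would likely rely on a full geodetic library, and the coordinates shown are placeholders.

```python
# Sketch: converting GPS coordinates to local east/north offsets (meters)
# around the user, the step needed to anchor a structure's model "in place".
# Small-area equirectangular approximation; a fielded system would use a
# proper geodetic library. Coordinates below are placeholders.
import math

EARTH_RADIUS_M = 6_371_000.0

def gps_to_local_en(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Approximate east/north offsets in meters from an origin fix."""
    lat0 = math.radians(origin_lat_deg)
    east = math.radians(lon_deg - origin_lon_deg) * math.cos(lat0) * EARTH_RADIUS_M
    north = math.radians(lat_deg - origin_lat_deg) * EARTH_RADIUS_M
    return east, north

# Placeholder: anchor a building corner ~100 m northeast of the user.
user = (39.16700, -76.89900)
corner = (39.16790, -76.89784)
print(gps_to_local_en(*corner, *user))
```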

Novel Perception

Ariel M. Greenberg

The Novel Perception system enhances users’ awareness of the world by providing them with naturalistic access to sources of information beyond what is perceivable by the basic human senses. The system, conceived of and designed by a team at the Johns Hopkins University Applied Physics Laboratory (APL), collects signals from a variety of sensors, synchronizes them in real time, registers them in real space, and then overlays them onto the real world as imagery and holograms seen through a mixed reality (MR) headset. The concept includes “virtual lenses” of hyperspectral, radio frequency, social, physiological, thermal, and radiological/nuclear overlays, so that a user can select multiple virtual lenses to create on-the-fly custom compound lenses across these modalities. Because the volume and velocity of the data streaming from these sensor modalities may be overwhelming, the system is envisioned to leverage artificial intelligence and brain–computer interfaces to sculpt the deluge per the operational tasks at hand. This article describes the approach to developing the system as well as possible applications.
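
Conceptually, a user-assembled “compound lens” amounts to compositing the selected sensor overlays onto the base view. The NumPy sketch below alpha-blends selected RGBA layers back to front; the layer names, random data, and simple blending scheme are illustrative assumptions, not the system’s design.

```python
# Sketch: assembling an on-the-fly "compound lens" by alpha-blending the
# overlay layers a user has selected over the base camera view. Layer names,
# random data, and the back-to-front blend are illustrative assumptions.
import numpy as np

H, W = 480, 640
base_view = np.zeros((H, W, 3))  # stand-in for the camera passthrough

# One RGBA image per sensing modality ("virtual lens").
lenses = {
    "thermal": np.random.rand(H, W, 4),
    "rf": np.random.rand(H, W, 4),
    "radiological": np.random.rand(H, W, 4),
}

def compound_lens(base, layers, selected):
    """Alpha-composite the selected overlays, back to front, over the base."""
    out = base.copy()
    for name in selected:
        rgba = layers[name]
        alpha = rgba[..., 3:4]
        out = alpha * rgba[..., :3] + (1.0 - alpha) * out
    return out

# The user toggles thermal + RF into a single custom lens.
frame = compound_lens(base_view, lenses, ["thermal", "rf"])
print(frame.shape)  # (480, 640, 3)
```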

Design and Preliminary Evaluation of an Augmented Reality Interface Control System for a Robotic Arm

David P. McMullen, Matthew S. Fifer, Kapil D. Katyal, Robert Armiger, Guy Hotson, James D. Beaty, Albert Chi, Daniel B. Drachman, and Brock A. Wester

Despite advances in the capabilities of robotic limbs, their clinical use by patients with motor disabilities is limited because of inadequate levels of user control. Our Johns Hopkins University Applied Physics Laboratory (APL) team and collaborators designed an augmented reality (AR) control interface that accepts multiple levels of user inputs to a robotic limb, using noninvasive eye tracking technology to enhance user control. Our system enables either direct control over 3-D endpoint position, gripper orientation, and aperture or supervisory control over several common tasks, leveraging computer vision and intelligent route-planning algorithms. This system enables automation of several high-frequency movements (e.g., grabbing an object) that are typically time-consuming and require high degrees of precision. Supervisory control can increase movement accuracy and robustness while decreasing the demands on user inputs. We conducted a pilot study in which three subjects with Duchenne muscular dystrophy completed a pick-and-place motor task with the AR interface using both traditional direct and newer supervisory control strategies. The pilot study demonstrated the effectiveness of AR interfaces and the utility of supervisory control for reducing completion time and cognitive burden for certain necessary, repeatable prosthetic control tasks. Future goals include generalizing the supervisory control modes to a wider variety of objects and activities of daily living and integrating the capability into wearable headsets with mixed reality capabilities.
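
One widely used idiom in gaze-driven supervisory control is dwell-based target selection: the object the gaze stays near for long enough becomes the commanded target. The sketch below implements that idiom generically; the angular threshold and dwell time are illustrative assumptions, not the study’s parameters.

```python
# Sketch: dwell-based target selection, a common idiom in gaze-driven
# supervisory control (not necessarily the study's implementation). An
# object becomes the commanded target once gaze stays within an angular
# threshold of it for a dwell period; both parameters are assumptions.
import numpy as np

def angle_to(gaze_dir, eye_pos, obj_pos):
    """Angle (radians) between the gaze ray and the direction to an object."""
    to_obj = obj_pos - eye_pos
    to_obj = to_obj / np.linalg.norm(to_obj)
    gaze = gaze_dir / np.linalg.norm(gaze_dir)
    return np.arccos(np.clip(np.dot(gaze, to_obj), -1.0, 1.0))

class DwellSelector:
    def __init__(self, max_angle_rad=0.1, dwell_s=0.8):
        self.max_angle, self.dwell = max_angle_rad, dwell_s
        self.candidate, self.held = None, 0.0

    def update(self, gaze_dir, eye_pos, objects, dt):
        """objects: {name: position}. Returns a name once dwell completes."""
        # Nearest object to the gaze ray, if within the angular threshold.
        hit = min(objects, key=lambda k: angle_to(gaze_dir, eye_pos, objects[k]))
        if angle_to(gaze_dir, eye_pos, objects[hit]) > self.max_angle:
            hit = None
        # Accumulate dwell time while the same object stays under gaze.
        self.held = self.held + dt if (hit is not None and hit == self.candidate) else 0.0
        self.candidate = hit
        return hit if self.held >= self.dwell else None

# Example: ~1 s of 30 Hz frames with gaze fixed on the "cup".
objects = {"cup": np.array([0.3, 0.0, 0.5]), "block": np.array([-0.2, 0.1, 0.4])}
selector = DwellSelector()
target = None
for _ in range(30):
    target = selector.update(np.array([0.3, 0.0, 0.5]), np.zeros(3), objects, dt=1 / 30)
print(target)  # -> "cup" after the dwell period elapses
```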

HoloLens Applications for the Demonstration of an Advanced Anthropomorphic Test Device for Under-Body Blast and the Dissemination of Finite Element Analysis Results

Nicholas A. Vavalle, Nathanael P. Kuo, and Catherine M. Carneal

The Warrior Injury Assessment Manikin (WIAMan) crash test dummy was developed in response to an Army need for better injury prediction capabilities in under-body blast testing. Concurrently, a finite element model (FEM, a physics-based computational model) was developed at the Johns Hopkins University Applied Physics Laboratory (APL) to accelerate the design process and provide simulated injury prediction capabilities. However, two main issues arose when the team presented the work to broad audiences. First, it was difficult to convey the results and impact of the FEM. Augmented reality suited this problem well because of the technical nature of the work and the lack of an off-the-shelf software solution a layperson could use to manipulate the model. Second, it was necessary to circumvent the need to transport the human-sized physical anthropomorphic test device (ATD). A new method for exploring the dummy was required, but one that still allowed an audience to experience the technology firsthand. Hence, two complementary Microsoft HoloLens applications were developed at APL to allow a user to explore the inner workings of the ATD and see it in a simulated blast environment. These apps connect the user with the project in their own surroundings while providing information about various ATD parts at the user’s pace. The applications have been demonstrated to diverse audiences at venues both locally and across the country and were successful in conveying the impact of the project.

XR for Advanced Prototyping of Spacecraft Mechanical Systems

Devin J. Hahne

This article discusses how teams in the Space Exploration Sector at the Johns Hopkins University Applied Physics Laboratory (APL) are using XR as an advanced prototyping capability. Prototypes enable engineering and design teams to see or experience an object before committing resources to full-scale production. XR provides a means of digitally prototyping high-fidelity physical information in a nonmaterial form. The collaborative, immersive qualities of XR leverage human visual processing, enabling teams to quickly engage with complex system information, make decisions, and confidently move from ideas to actions. The design intelligence gained through using XR enables teams to make faster decisions with greater confidence and less risk. APL teams introduced production-grade XR tools into mechanical design workflows in 2017, making critical contributions to Parker Solar Probe, Europa Clipper, and other programs. But XR is only a small part of a bigger picture, one challenging companies to rethink conventional business operations for a competitive global industrial ecosystem. Incorporating XR as part of a broader digital transformation (DX) strategy carves a path to advantages and opportunities that cannot be realized by XR alone.

ARMOUR X: Augmented Reality Mission Operations UseR eXperience

Robert A. Berardino and Arthur O. Tucker IV

When developing ARMOUR X (Augmented Reality Mission Operations UseR eXperience), a Johns Hopkins University Applied Physics Laboratory (APL) team set out to explore whether mixed reality technologies can help military space professionals realize a common operating picture. Through an independent research and development (IRAD) grant from APL’s National Security Space Mission Area, the team sought to address the present-day problem of communicating large quantities of data and information with outdated presentation and visualization modalities. Through the use of mixed reality technologies (e.g., augmented reality and virtual reality) to create immersive experiences, the system aims to facilitate effective decision-making in operational environments with narrowing tactical timelines, such as space. During its three consecutive years of funding (FY2017–2019), the ARMOUR X team accomplished most of its engineering objectives, created a portable demonstration system for stakeholder engagement, and acquired stakeholder community feedback to inform system design decisions and future directions.

Architectural Design in Virtual Reality

Michael P. Boyle, James L. Dean, and William J. Kraus

B201, one of the most recent construction efforts at the Johns Hopkins University Applied Physics Laboratory (APL), is a 263,000-square-foot building with a 647-person occupancy. As part of change management efforts, the Research and Exploratory Development Department (REDD) created an application allowing future B201 occupants to lead themselves on a virtual tour through the under-construction building. Beginning in late 2018, the team took the final product on a road show, offering staff members the option either to point and click through the space on their computers or to take a virtual reality (VR) tour of their new accommodations.

CONVEY: Connecting STEM Outreach Now Using VIE Education for Youth

Brock A. Wester, Andrew R. Badger, Matthew S. Fifer, Elise D. Buckley, Daniel M. Portwood, James D. Beaty, and M. Dwight Carr

As part of the CONVEY (Connecting STEM Outreach Now Using VIE Education for Youth) program, a multidisciplinary Johns Hopkins University Applied Physics Laboratory (APL) team designed a mixed reality workshop to provide experiential instruction to children in families with wounded warriors. The goal of the workshop was to improve participants’ understanding of their family members’ conditions; of specific topics in biology, anatomy, and engineering; and of current and future rehabilitative technologies. The hope was that this increased and personalized understanding might motivate them to pursue careers in science, technology, engineering, and mathematics (STEM). This effort, commissioned by the Office of Naval Research, leveraged both traditional learning methods and immersive technologies. Using a modified version of the VIE (which stands for Virtual Integration Environment)—the virtual training platform APL developed to help amputees quickly adapt to operating its revolutionary Modular Prosthetic Limb—the team created a number of scenarios in virtual and mixed reality to enhance the lesson-based activities. This article outlines the approaches for developing these immersive scenarios, documents the technologies and capabilities used, and presents the program’s measures of effectiveness.

ESCAPE with PARTNER: Experimental Suite for Cooperatively Achieved Puzzle Exercises with the Platform Assessing Risk and Trust in Nonexclusively Economic Relationships

Julie L. Marble, Ariel M. Greenberg, Sean M. Kain, Brandon J. Scott, and Mary E. Luongo

Trust is a socio-emotional construct used to explain why one is willing to be vulnerable to the variable and unpredicted actions of another independent actor. Virtual reality (VR) provides an excellent test platform for assessing trust because it offers a safe environment in which participants can nonetheless be made to feel vulnerable. In the research described in this article, we developed a VR-based game to assess trust between humans and robots in a collaborative task. This article describes the development of, and first experiments with, this experimental platform for exploring humans’ trust in robots.

Inside Back Cover

To provide insight into whether the literature (news articles) suggests benefits of XR, Digital Transformation, and Industry 4.0, APL’s Christina K. Pikas (BS, MLS, PhD, Librarian and Service Manager) performed an analysis in January 2020 using Quid. Quid reveals “connections, trends, and insights that will help you understand the story behind your customers, competitors, markets . . . and brand[s]” (https://quid.com/). A search on the terms (“Digital Transformation” OR “XR” OR “augmented reality” OR “virtual reality” OR “mixed reality” OR “Model-based design” OR “Model-based engineering” OR “Industry 4.0”) AND (“success” OR “ROI”) produced the two maps on the inside back cover. The lower, more colorful one maps the news article network of 4,277 stories, colored by cluster and sized by degree. The upper map traces sentiment in those same news articles, with nodes colored by sentiment and sized by degree. The striking revelation in this analysis is the overwhelming positivity of the discussion around XR.