Director's Message

Video of launches from USS Gravely, USS Carney, and USS Dwight D. Eisenhower supporting strikes on Iranian-backed Houthi targets. Credit: U.S. Central Command Public Affairs

Defending the Nation

As technological disruption and unpredictable, emerging threats redefine the global landscape, the U.S. faces daunting national security challenges. APL is leveraging its deep expertise in specialized fields to support national priorities and technology development programs. By combining creativity and technical prowess within a culture of innovation, APL is tackling the toughest challenges of our time and driving solutions with impact on multiple fronts.

Bold Innovation

In our increasingly complex world, solutions to the most pressing problems require strategic foresight, creativity and technical expertise. APL researchers are combining these competencies and developing technologies that will shape our future — making impactful advancements in artificial intelligence (AI) and autonomy, health care, energy, manufacturing and computing.

Exploring the Extremes

Parker Solar Probe’s closest approach to the Sun capped another year in which APL researchers pushed the capabilities of technology to provide a better understanding of the universe and our place within it. Using decades of experience in space mission management and expertise in electrical and mechanical design and fabrication, materials science, hypersonic vehicles, cislunar space and planetary defense, the Laboratory collaborated with organizations around the world to answer fundamental questions and tackle pressing threats and challenges.

Countering Evolving Threats

As rapidly advancing technologies give rise to novel and shifting threats, safeguarding the nation requires agile, forward-thinking responses. APL draws on its longstanding strengths in systems engineering, advanced research and data-driven analysis to anticipate and mitigate these complex challenges. Through inventive thinking, deep technical expertise and a collaborative spirit of innovation, we are delivering transformative solutions that address tomorrow’s threats today.

Labs of the Lab

Tech Transfer

University Collaborations

A Culture of Innovation

Awards and Honors

Bold Innovation

An artist’s rendering of an AI-piloted F-16 fighter jet. This concept represents the Defense Advanced Research Projects Agency Air Combat Evolution (ACE) program’s milestone achievement in autonomous combat technology, marked by the secretary of the Air Force’s historic AI-assisted flight on May 2.

The Rise and Evolution of AI-Piloted Aviation

On May 2, 2024, the secretary of the Air Force strapped into the cockpit of an F-16 fighter jet used to test AI-assisted flight capabilities and prepared for takeoff. The historic flight demonstrated just how much progress the Defense Advanced Research Projects Agency’s (DARPA) Air Combat Evolution (ACE) program has made toward integrating AI into combat scenarios and increasing trust in autonomous combat systems.

The secretary of the Air Force flies in the X-62A VISTA in the skies above Edwards Air Force Base, California, on May 2, demonstrating unprecedented advances in air combat technology. Credit: U.S. Air Force

The team behind ACE, which includes APL, the Air Force and industry partners, had been preparing for the monumental flight for weeks. In the lead-up to and during the flight, algorithms developed by APL and other program performers commanded the X-62A, a modified F-16 aircraft also referred to as VISTA (Variable In-flight Simulator Test Aircraft), at the Air Force Test Pilot School at Edwards Air Force Base, California.

APL has served as a core member of the ACE team since the 2020 AlphaDogfight Trials, a virtual showdown between eight AI research teams from across the United States. In the trials’ final round, the winning AI agent went 5–0 against a U.S. Air Force F-16 pilot. APL developed the simulation environment and adversary algorithms for the competition, allowing the industry agents to train against an increasingly capable simulated “enemy” aircraft.

A pilot immerses in virtual aerial combat during DARPA’s 2020 AlphaDogfight Trials, where APL served as a core technical contributor to the Air Combat Evolution program. In the final round, the winning AI agent defeated a U.S. Air Force F-16 pilot in all five simulated dogfights.

To ensure success in the flight test program and enable easy integration of agent designs from multiple performers, APL developed the ACE Distributed Operations Manager framework, which is integral to the ability of AI agents to fly the X-62A VISTA. In September 2023, ACE's AI algorithms autonomously piloted the X-62A VISTA against a human-piloted jet in a simulated aerial battle.

In just four years, the APL-developed algorithms have advanced from controlling simulated F-16s in aerial combat to commanding an actual aircraft in flight. The latest test only underscored the growing capabilities of this cutting-edge technology.

“The APL team developed a critical capability for DARPA to demonstrate and evaluate AI applied to air combat. From proof of concept to live flight exercises, the lessons we’ve learned will inform future Air Force AI applications,” said Mike Avera, APL project manager for DARPA ACE.

Collaborative Dynamics in Autonomous and Human–Robot Teaming

Coordinating multiple autonomous agents to work as a team remains a hard problem in robotics, but APL researchers have made significant progress toward addressing fundamental challenges.

One of those challenges is sheer complexity: Each robot has to navigate an unpredictable environment and perform tasks while coordinating with teammates in response to dynamic situations.

In 2024, APL researchers approached this problem using topological graphs, which are commonly used to represent environments in robot pathfinding and collision avoidance. By representing important landscape features as nodes and the connections between them as edges, these graphs simplify the environment and enable efficient computation. However, when numerous robots traverse the environment, the problem space can quickly become computationally intractable — especially in dynamic environments where the risks associated with any given move can change from one instant to the next.

This artist rendering depicts APL-developed algorithms enabling multirobot coordination across dynamic terrain. With new planning approaches for high-level tactics, APL enables robots to navigate complex terrain and collaborate effectively in real-world missions.

Joseph Moore, chief scientist for APL’s Robotics Group, led a team that addressed this issue by designing a new dynamic topological graph structure that captures the critical features of the problem space and the relationships between the robots. This compact representation allowed the researchers to leverage a technique called mixed-integer programming to rapidly compute solutions to complicated problems on the fly regardless of how many robots are involved, allowing multirobot teams to coordinate their actions with manageable complexity.
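As a rough illustration of planning over a topological graph, the sketch below picks a joint move for a small team on a hypothetical graph. A real planner would encode the risk and collision constraints as a mixed-integer program and re-solve it as conditions change; here brute-force enumeration stands in for the solver, and all node names and risk weights are invented.

```python
from itertools import product

# Toy topological graph: nodes are landscape features, edges are traversable
# links, and edge weights model the (possibly time-varying) risk of a move.
graph = {
    "ridge":  {"forest": 2.0, "road": 5.0},
    "forest": {"ridge": 2.0, "bridge": 1.0},
    "road":   {"ridge": 5.0, "bridge": 3.0},
    "bridge": {"forest": 1.0, "road": 3.0},
}

def plan_step(positions):
    """Choose one joint move for all robots that minimizes total risk while
    forbidding two robots from occupying the same node. A real planner would
    pose this as a mixed-integer program; brute force stands in here."""
    options = [list(graph[p].items()) for p in positions]
    best, best_cost = None, float("inf")
    for joint in product(*options):
        targets = [node for node, _ in joint]
        if len(set(targets)) < len(targets):   # collision constraint
            continue
        cost = sum(risk for _, risk in joint)
        if cost < best_cost:
            best, best_cost = targets, cost
    return best, best_cost
```

Because the graph abstracts the terrain into a handful of nodes, the joint search space stays small enough to re-solve every time edge risks change.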


The team ran this framework effectively in simulations of real-world missions. It then demonstrated the algorithms on live robots in an exercise held in Texas in June in collaboration with the Army Research Laboratory (ARL), which is funding this work.

Navigating complex environments is not the only challenge facing successful human–robot teaming. Inefficient communication is a hurdle too. While commercially available robots have become quite sophisticated, it is still difficult to control them using spoken language. To solve that problem, an international research team led by APL created a technology called ConceptGraphs. The team’s work was funded by ARL through the Army Artificial Intelligence Innovation Institute and APL’s Independent Research and Development program.

Using generative AI and advanced scene mapping, APL researchers are transforming robots into capable teammates, enhancing their effectiveness in dynamic environments like disaster response and battlefield support.

Using ConceptGraphs, robots create 3D scene graphs that compactly and efficiently represent an environment. Through training on image–caption pairs from large vision and language models, objects in the scene are assigned tags, which help robots understand the uses of objects and the relationships between them. ConceptGraphs also enables humans to give robots instructions in plain language rather than through fixed commands, and it supports multimodal queries, which combine an image with a question or instruction.

“You don’t have to ask it if it sees a car — you can say, ‘Show me everything with four wheels,’ or ‘Show me everything that can carry me places,’” said Dave Handelman, a senior roboticist and a collaborator on the project.

In a real-world scenario, this might translate to a medic asking a robot to locate casualties on a battlefield and transport them to safety until the medic can attend to them.
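The kind of open-vocabulary lookup described above can be sketched as follows. This is not the ConceptGraphs implementation; in the actual system a language model matches queries against captioned 3D scene graphs, while here simple tag sets on hypothetical objects stand in for that machinery.

```python
from dataclasses import dataclass, field

@dataclass
class SceneObject:
    """One node of a toy 3D scene graph."""
    name: str
    position: tuple                          # 3D coordinates in the mapped scene
    tags: set = field(default_factory=set)   # open-vocabulary labels, as a
                                             # vision-language model might assign

scene = [
    SceneObject("truck", (4.0, 1.0, 0.0), {"vehicle", "four wheels", "can carry people"}),
    SceneObject("bicycle", (1.0, 2.0, 0.0), {"vehicle", "two wheels"}),
    SceneObject("stretcher", (0.5, 0.2, 0.0), {"can carry people", "medical"}),
]

def query(scene, required_tag):
    """Return the names of objects satisfying a free-form request, e.g.
    'everything that can carry me places'. In ConceptGraphs a language
    model performs this matching; tag lookup stands in for it here."""
    return [obj.name for obj in scene if required_tag in obj.tags]
```

A query like `query(scene, "can carry people")` returns both the truck and the stretcher, even though neither was labeled with that exact phrase by a human.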

To enhance the robots’ ability to create task-execution plans, evaluate progress and replan, the researchers also created an autonomous AI agent named ConceptAgent. ConceptAgent uses a large language model (LLM) as its engine, allowing it to reason sequentially — and to write and execute its own code.

A person can give a command to the ConceptAgent as if they were speaking to another person. The robot can accomplish the task autonomously and even pivot when it hits a roadblock or makes a discovery. Researchers demonstrated this at APL by tasking a robot with identifying an injured animal and relocating it to an emergency sled. The robot passed the test.
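A minimal plan-act-replan loop in this spirit might look like the sketch below. The interfaces are invented for illustration, as ConceptAgent's actual APIs are not described here: the `llm` callable proposes the next action given the goal and history, and `execute` reports success or an obstacle.

```python
def concept_agent(goal, llm, execute, max_steps=5):
    """Minimal plan-act-replan loop driven by a language model.
    `llm(goal, history)` proposes the next action (or "done");
    `execute(action)` returns (success, observation). Both interfaces
    are hypothetical stand-ins, not ConceptAgent's real API."""
    history = []
    for _ in range(max_steps):
        action = llm(goal, history)
        if action == "done":
            break
        ok, observation = execute(action)
        history.append((action, ok, observation))
        # On failure, the next llm() call sees the obstacle and can replan.
    return history
```

The key behavior is in the loop: a failed action is not fatal, because the model sees the failure in its history and can propose an alternative, which mirrors the "pivot when it hits a roadblock" capability described above.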

“Teaming with robots was always very labor-intensive for humans — the language was controlled by humans, the positions of key objects were given by humans and even the execution plan was created by humans,” said Corban Rivera, a senior AI researcher and principal investigator for the project. “Now, it’s more collaborative. All the person has to do is give a command in natural language, and the robot can do the rest.”

Enhancing Health Care Through Innovation

The Lab tackled several pressing health care challenges with innovative solutions, including making significant progress toward creating a brain–computer interface (BCI) that does not require a surgical implant. In work funded by DARPA’s Next-Generation Nonsurgical Neurotechnology program and published in 2024, the team showed that neural tissue deformations may provide a novel signal of brain activity that could be leveraged for future BCI devices.

BCI technologies work by recording and interpreting neural activity associated with a function such as speech, movement or attention, often translating it into control of an external physical device by thought alone. They have vast potential benefits, but the need for a surgical implant severely limits their use — only about 50 people in the world have had a BCI implanted.

To make the devices less invasive, the team collaborated with Johns Hopkins Medicine in Baltimore, combining expertise in biomedical and underwater imaging, acoustic processing, real-time hardware and software systems, neuroscience and medical research. The result was a digital holographic imaging (DHI) system to identify and validate a neural signal from the tissue deformation — on the order of tens of nanometers in height — that occurs during neural activity.

The DHI system operates by actively illuminating the tissue with a laser and recording the light scattered from the neural tissue on a special camera. This information is processed to form a complex image of the tissue from which magnitude and phase can be precisely extracted to spatially resolve changes in brain tissue velocity. Numerous fundamental tests were conducted over several years to ensure the signal the team identified was in fact correlated with neuronal firing.
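The phase-to-velocity step can be illustrated with the standard interferometric relation: in reflection, an out-of-plane displacement of one wavelength shifts the recorded phase by 4π. The sketch below is a generic implementation of that textbook relation, not APL's actual processing chain, and the wavelength and frame interval are assumed values.

```python
import numpy as np

WAVELENGTH = 532e-9   # assumed laser wavelength (m); APL's choice isn't stated
FRAME_DT = 1e-3       # assumed time between camera frames (s)

def surface_velocity(phase_prev, phase_next):
    """Estimate out-of-plane velocity from the phase of two complex
    holographic images. In reflection, a displacement of one wavelength
    shifts the phase by 4*pi, so displacement = wavelength * dphi / (4*pi).
    Generic interferometric relation, used here purely as a sketch."""
    dphi = np.angle(np.exp(1j * (phase_next - phase_prev)))  # wrap to [-pi, pi]
    displacement = WAVELENGTH * dphi / (4 * np.pi)           # meters
    return displacement / FRAME_DT                            # meters per second
```

At these parameters a phase shift of π between frames corresponds to a quarter-wavelength displacement, about 133 nanometers, consistent with the tens-of-nanometers deformations the system must resolve.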

The neural signal was challenging to identify because of competing noise from physiological clutter, such as blood flow, heart rate and respiratory rate. Dave Blodgett, a chief scientist at APL whose background is in airborne and underwater remote sensing technology development, described the challenge as a remote sensing problem, where the team needed to detect a small signal — neural activity — in a complex, cluttered environment — the brain.

APL researchers developed a high-resolution neural recording method that detects brain activity through the skull, advancing nonsurgical brain–computer interfaces with transformative potential for medical and assistive applications.

While the team’s goal was to mitigate the physiological clutter to identify the neural signal, they discovered that the clutter itself could provide insight into an individual’s health. The ability to record physiological signals expands the system’s potential applications — for example, the system noninvasively recorded intracranial pressure, something that typically requires drilling a hole through a patient’s skull. The ability to monitor brain health from the outside could help clinicians track such measures without invasive procedures.

APL’s focus on understanding the brain through noninvasive methods was also applied in a more clinical setting in 2024, as researchers collaborated with Johns Hopkins Medicine to make significant strides toward creating objective digital measures for assessing and tracking mental health.

Tagged TEAPOT by its APL developers, the project encompasses a telehealth suite for performing “digital phenotyping” — in this context, using standoff sensing to detect symptoms associated with psychiatric conditions, such as depression, anxiety and post-traumatic stress disorder — to provide physicians with clinically relevant data during virtual visits.

TEAPOT is not meant to replace the human practitioner but rather give them access to higher-quality, actionable information about their patients. To that end, the APL team, led by optical engineer Erika Rashka, has been collaborating closely with Peter Zandi, co-director of the Precision Medicine Center of Excellence in Mood Disorders and vice chair of Precision Medicine at the Johns Hopkins University School of Medicine, to create outcomes that really matter to psychiatrists.

In 2024, the APL team developed a set of tools that could be used to extract and analyze as much useful information as possible from telehealth video recordings. This effort called for creative, collaborative work to combine various sensing modalities; the team built upon existing models to develop a custom neural network for detecting emotional states from video data, validated machine learning algorithms specific to standoff biometric analysis, and built on earlier APL work utilizing open-source machine learning models to predict emotional states from audio data.

The team has designed a framework to integrate each of these models so that clinicians can use telehealth session data to produce an objective estimate of a person’s depression, anxiety or stress level that aligns with clinical questionnaires. The TEAPOT team also received funding from NASA as part of an effort to better understand how best to support the health and performance of crew members during long-duration spaceflight missions.

In a more Earth-focused application, APL researchers this year also developed technology to improve quality of life for people with visual impairments. In collaboration with the Johns Hopkins Whiting School of Engineering (WSE), the Johns Hopkins Wilmer Eye Institute and Carnegie Mellon University, the team developed an AI-enabled navigation system that helps blind or visually impaired users navigate their surroundings with greater confidence and accuracy.

The system maps environments, tracks users’ locations and provides real-time guidance. It also processes information from depth imaging and color sensors to identify specific objects, and allows users to ask for guidance on specific aspects of their surroundings. A team of APL researchers that included Seth Billings, Francesco Tenore, Breanne Christie, Nicolas Norena Acosta, Chigozie Ewulum and Michael Pekala along with WSE’s Marin Kobilarov enhanced the basic visual feedback of current commercial systems with additional haptic, visual and auditory sensory inputs to create a more comprehensive navigation system.

The haptic feedback involves an APL-developed headband that vibrates in different places to indicate the direction of obstacles or the path the user should follow. The auditory feedback uses voice prompts and spatial sound to give verbal directions and alerts about the surroundings. The combined sensory inputs to the system are also translated into visual feedback that enhances the user’s ability to perceive obstacles and navigate effectively.
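One way such directional haptics could work is to map an obstacle's bearing to the nearest of several motors spaced evenly around the band. The function below is a generic sketch; the motor count and layout of APL's headband are not specified in the source.

```python
def motor_for_bearing(bearing_deg, num_motors=8):
    """Map an obstacle bearing (0 = straight ahead, increasing clockwise)
    to the index of the nearest vibration motor on a headband with
    `num_motors` evenly spaced motors. Counts and layout are illustrative."""
    spacing = 360 / num_motors
    return round((bearing_deg % 360) / spacing) % num_motors
```

With eight motors, an obstacle dead ahead activates motor 0, one directly to the right activates motor 2, and bearings near 360 degrees wrap back to the front motor.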

The research was presented in April at SPIE Defense + Commercial Sensing 2024 and tested in clinical trials over the summer.

An artist rendering of APL’s haptic and auditory guidance system integrated into a headband. The system provides users with directional cues through targeted vibrations and spatial audio, enabling safer navigation for the visually impaired.

Fabricating Next-Gen Wearable Electronics and Smart Textiles

Giving new meaning to the term “power shirt,” APL researchers established new, scalable methods of developing battery- and solar-powered fibers — making it theoretically possible for electrical energy to be harvested from and stored in clothing. These fibers could power high-performance wearable electronics that breathe, stretch and wash just like conventional textiles.

Today’s fiber batteries are neither easy nor cheap to make. The textile equipment used to manufacture them is so large that production is confined to a handful of specialized facilities, putting it out of reach for most of the battery industry. Standard fiber batteries also suffer from lower performance because the electrodes often twist together, rendering most of the electrode surface inactive.

“As demands for electronic textiles change, there is a need for smaller power sources that are reusable, durable and stretchable,” said Konstantinos Gerasopoulos, assistant program manager for Physics, Electronic Materials and Devices at APL and lead investigator of this project. “Our vision is to develop sunlight-harvesting fibers that can convert sunlight to electricity and battery fibers that can store the generated electricity in the textile.”

In a study published in May, APL scientists demonstrated that customized equipment could be used to make super-thin fiber batteries. This strategy made the process portable and suitable for large-scale production: All the equipment needed to create the fiber batteries could fit in a small room.

The batteries are made of flat strips of anode and cathode electrodes and a polymer separator that are fed together into a heated roll press and laminated into a stacked design. The construction is similar to that of conventional cell phone batteries and provides greater power and performance than standard fiber batteries. The stack is then laser cut into a fiber-like strand roughly the width of five human hairs. This marked the first use of laser cutting on a full battery stack and demonstrated the method’s viability for customizing battery size while maintaining performance.

APL researchers fabricated tiny solar cells on flexible circuit boards, encapsulated in protective polymer, and woven with nylon to create energy-harvesting fibers for next-gen wearable electronics and smart textiles.

Starting from a specific type of solar cell that has both positive and negative terminals on its back side, the researchers cut and assembled tiny solar cells — small enough to fit between the ridges of a fingerprint — on thin, flexible circuit boards before sealing them in a protective polymer to create a fiber-like strand. These fiber solar cells stood up to extensive mechanical and electrical testing with no loss of efficiency.

In a proof of concept, the researchers used a custom-built mini loom to weave nylon and solar cell fibers into a small textile. The swatch of fibers was placed under a lamp and attached to a small circuit board and an LED blinker, and within seconds, the strip powered the blinker’s flashing red light.

The method used to assemble and encapsulate the solar cells onto fiber substrates is also extensible to other technologies, such as sensors, LEDs or batteries mounted onto the surface of flexible fibers.

Laboratory researchers used a custom-built mini loom to weave nylon and solar cell fibers into a textile. When placed under light and connected to a circuit, the textile powered an LED blinker, demonstrating the viability of energy-harvesting fibers for wearable electronics.

Transforming Additive Manufacturing

Additive manufacturing — which encompasses a variety of fabrication techniques that build structures layer by layer — is poised to have a transformative impact on the nation’s industrial base. But its Achilles’ heel is flaw formation: defects that arise during a build create uncertainty in part performance, limiting where the technology can be applied.

APL experts are addressing this challenge in several ways, including developing sensors fast enough to detect flaws as they begin to form so the process can stop them before they take hold. Their work builds on years of research supported by internal funding as well as the Hopkins Extreme Materials Institute, ARL, the Office of Naval Research and the Naval Nuclear Laboratory.

Multiple flaws can form during additive manufacturing, particularly in laser powder bed fusion, a method of additive manufacturing that uses lasers to melt metal powders and solidify them into complex shapes. These flaws can weaken a part’s structural integrity and are hard to spot in real time with current sensors.

Mary Daffron and Sam Gonzalez work in APL’s Additive Manufacturing Center, where advanced techniques are being developed to improve material integrity and enable real-time flaw detection, transforming the future of manufacturing.

APL researchers proposed that there was enough time to detect the beginning stages of a flaw in laser powder bed fusion and hypothesized that the manufacturing process could be altered so that the unwelcome formation would heal. A computational simulation confirmed that it was possible to spot and repair flaws in real time.

They first used conventional sensors to try to spot the earliest indicators of flaw formation but ran into optical resolution and speed limitations. This prompted a collaboration with WSE to enhance an APL-patented sensor that the two organizations had previously developed together. The team added photodiodes at multiple wavelengths and increased the sampling frequency to capture high spatial- and temporal-resolution data on the melt pool and its dynamics, enabling them to identify the early stages of a flaw fast enough to enable real-time repair.


With the high-speed sensor up and running, the researchers developed a control framework providing full control between the sensor and the laser, including the ability to shut the laser off in less than 10 millionths of a second when the melt pool got too hot and was likely to create a flaw.
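The interlock logic at the heart of such a framework can be sketched in a few lines. This is a generic threshold check, not APL's controller; the threshold and units are invented, and the real system runs its control logic in FPGA hardware to achieve sub-microsecond latency.

```python
MELT_POOL_LIMIT = 1.0   # normalized intensity threshold; real units/values not public

def control_step(photodiode_samples, laser_on):
    """One iteration of a threshold-based interlock between the melt-pool
    sensor and the laser: if any photodiode sample exceeds the limit,
    command the laser off before a flaw (e.g. keyhole porosity) can form.
    Returns the new laser state. Purely illustrative of the concept."""
    too_hot = max(photodiode_samples) > MELT_POOL_LIMIT
    return laser_on and not too_hot
```

In software this comparison takes microseconds at best, which is why the team moved the equivalent logic into dedicated hardware to reach the sub-microsecond response time described below.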

Continuing to draw on expertise from across APL, the team adapted a high-speed field-programmable gate array originally designed for missile-seeking applications and repurposed it as a controller to enable real-time repair.

“A unique aspect of working at APL is the ability to leverage technology from seemingly unrelated areas. For example, we can take knowledge we apply to missile defense — responding to measurement inputs very quickly and making adjustments even faster — and apply it to additive manufacturing,” said Steve Storck, an APL project manager and chief scientist for manufacturing technologies in APL’s Research and Exploratory Development Department.

After integrating all the components, the team successfully demonstrated the system’s ability to respond in a mere 952 nanoseconds — far faster than the blink of an eye and 10 times faster than required. This research is laying the groundwork as APL continues working toward producing parts that can be trusted straight from the build.

And APL researchers are already advancing additive manufacturing for space applications, leveraging its rapid, cost-effective production of lightweight parts to drive innovation. Among their contributions is CHAPS-D, a small satellite imaging spectrometer funded by NASA’s Earth Science Division. Designed by APL atmospheric chemist Bill Swartz and his team, CHAPS-D can pinpoint pollution sources at an unprecedented resolution of half a square mile from low Earth orbit.

At NASA’s request, APL incorporated 3D-printed components into CHAPS-D’s design. The Netherlands Organisation for Applied Scientific Research (TNO), with APL’s input, used topology optimization to create the instrument’s lightweight optomechanical structure, reducing material use while maintaining strength. They overcame significant challenges with laser powder bed fusion, which required precise control over material properties influenced by trace metals, machine types and laser power.

The CHAPS-D prototype, developed with cutting-edge additive manufacturing techniques at APL, demonstrates new possibilities for creating compact and precise environmental monitoring sensors.

Leveraging prior APL-funded research and fast modeling techniques, the team developed a rapid design framework to test and evaluate multiple materials in record time. This approach, according to Storck, has made APL one of the fastest in the world at developing material properties for additive manufacturing — sometimes accomplishing in a week what once took years. For CHAPS-D, they identified Scalmalloy, a high-strength aluminum alloy, as the ideal material in just six months.

These advancements have positioned APL as a leader in additive manufacturing. The team’s innovative work on CHAPS-D earned recognition from the Minerals, Metals and Materials Society with two awards in 2024, underscoring APL’s role in advancing material science and space technology.

Continuing to push the bounds of innovative manufacturing methods, APL researchers leveraged cutting-edge techniques and shape memory alloys to create an antenna that can change its shape based on its temperature — a technology with transformative potential in a wide range of military, scientific and commercial applications.

APL engineers developed a shape-morphing antenna using a 3D-printed nitinol alloy that transforms from a flat spiral disk to a cone when heated. Inspired by adaptive technologies, this breakthrough could lead to revolutionary applications in fields from space exploration to telecommunications by providing flexible, self-deploying structures without external mechanisms.

Electrical engineer Jennifer Hollenbeck said she got the idea from “The Expanse” television series, where alien technology is organic and shape-changing. “I have spent my career working with antennas and wrestling with the constraints imposed by their fixed shape,” she said. “I knew APL had the expertise to create something different.”

In 2019, she reached out to Storck, who led an internally funded project to create a promising methodology to additively manufacture shape memory alloys. These unique materials deform at lower temperatures but return to a “remembered” shape when heated.

Artist’s rendering of a conceptual shape-morphing component designed for deployable space structures. While not physically fabricated yet, the design illustrates how thermally activated joints enable sequential deployment without external motors — offering a glimpse into the future of adaptable, lightweight space technology.

As scientists and engineers worked on new applications for nitinol — a shape memory alloy of nickel and titanium that had been used at APL to create coils that would extend down through a person’s esophagus to assist with heart imaging — a desire arose to 3D-print complex shapes with it. But that presented a problem: Nitinol and other shape memory alloys conventionally require extensive mechanical processing to achieve the shape memory effect, and so they are typically available only as wire or in thin sheets, defeating the purpose of printing complex shapes.

The APL team tackled the fundamental challenges associated with scalable additive manufacturing of nitinol components. After extensive experimentation toward the antenna application, the team altered the ratio of nickel and titanium, but the first attempt fell short.

Undeterred, Hollenbeck and team submitted a proposal for a Propulsion Grant — an internal funding program that supports development of revolutionary solutions to critical challenges — with a new antenna design. APL researchers had been able to 3D-print nitinol using a method in which the alloy is heated and cooled to alternate between two “remembered” shapes. With it, the team developed an antenna that was shaped like a flat spiral disk when cool and became a cone spiral when heated.

This antenna, too, came with challenges, including the need to heat it without interfering with its radio-frequency properties. To solve this problem, radio frequency and microwave design engineer Michael Sherburne helped the team decouple the AC/DC power with a novel power line that could handle the current needed to heat the antenna quickly without interfering with the radio-frequency line running through it. Finally, in 2024, APL engineers worked out the processing parameters needed to print the antennas at speed and scale. The team is now optimizing those parameters to work across multiple types of machines and variations of material.

The result of this cross-Lab collaboration? A radically innovative technology that could have wide-ranging applications from special operations in the field, to mobile network telecommunications, to space missions exploring distant worlds.