Rapidly Patching Legacy Software Vulnerabilities in Mission-Critical Systems
October 16, 2019 | DARPA
There are a vast number of diverse computing devices used to run the critical infrastructure our national security depends on—from transportation systems to electric grids to industrial equipment. Much like commercial or personal computing devices, these systems utilize embedded software to execute and manage their operations. To fix certain security vulnerabilities, commercial and personal devices must undergo frequent updates, and are replaced every few years—or on occasion, more frequently when an update fails. Mission-critical systems are built to last for decades, and rarely have the same short upgrade cycles. These systems are expensive to develop and hard to replace, and as they become increasingly connected for the purposes of maintenance diagnostics and data collection, this proliferation of connected software is opening them to compromise. While the amount of deployed vulnerable software is growing exponentially, the effective means of addressing known vulnerabilities at scale are limited.
“Patching vulnerabilities in legacy software used by mission-critical systems is a challenge that is only growing in importance and complexity,” said Dr. Sergey Bratus, a program manager in DARPA’s Information Innovation Office (I2O). “Even after a particular flaw is fully understood, and a remediation approach has been developed and expressed as a source code change in the software, a vendor’s ability to produce patches for all of their deployed devices in a timely, assuredly safe, and scalable manner is limited. This results in mission-critical software going unpatched for months to years, increasing the opportunity for attackers.”
Today, identifying and remediating software vulnerabilities in legacy binaries requires highly skilled software engineers who must make expert assumptions based on whatever source code samples and limited knowledge of the original development environment may be available. The engineers are responsible for understanding the structure of the binary, developing and applying a patch by hand, and then manually analyzing and testing the binary to ensure it works properly. The process is arduous and time consuming, with minimal assurance that the system will continue working as intended after the fix is applied. Further, this approach is becoming increasingly untenable as the amount of deployed software within mission-critical systems continues to grow.
The Assured Micropatching (AMP) program seeks to address these challenges and accelerate the process of patching legacy binaries in mission-critical systems and infrastructure. AMP aims to develop tools and methodologies for analyzing, modifying, and fixing legacy software in binary form with the assistance of assured, targeted “micropatches.” Micropatches are small patches that change the binary as little as possible in order to achieve an intended objective while also minimizing the potential side effects of the fix. AMP aims to create breakthrough technologies to reason about these small software fixes and, perhaps most importantly, provide proofs to assure that the system’s original baseline functionality is not lost or altered by the fix.
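The idea of a minimal, layout-preserving binary change can be illustrated with a short sketch. This is not an AMP tool; the offsets, opcode bytes, and the single-byte fix below are hypothetical, chosen only to show why a same-length, in-place replacement leaves every other address in the binary untouched.

```python
def apply_micropatch(image: bytes, offset: int, old: bytes, new: bytes) -> bytes:
    """Replace `old` with the same-length `new` at `offset`.

    A same-length replacement shifts no other bytes, which is the
    sense in which a micropatch changes the binary as little as
    possible and minimizes side effects elsewhere in the program.
    """
    if len(old) != len(new):
        raise ValueError("micropatch must not change the binary's layout")
    if image[offset:offset + len(old)] != old:
        raise ValueError("binary does not match the expected baseline")
    return image[:offset] + new + image[offset + len(old):]

# Hypothetical flaw: a bounds check uses a signed jump (0x7e, jle)
# where an unsigned one (0x76, jbe) is needed; the fix flips one byte.
baseline = bytes.fromhex("4889f87e0431c0c3")
patched = apply_micropatch(baseline, 3, b"\x7e", b"\x76")

assert len(patched) == len(baseline)                            # layout preserved
assert patched[:3] == baseline[:3] and patched[4:] == baseline[4:]  # one byte changed
```

In practice a real fix rarely reduces to one byte; the sketch only captures the constraint that a micropatch preserves the surrounding binary exactly.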
“Think of how many times you have updated software on your personal device and the update inadvertently caused some of the software to stop working, or worse, ‘bricked’ the device. With current patching approaches, we are not given the assurance that the system will continue working as intended after the fix is applied. Assured Micropatching aims to create and apply fixes in an automated and assured way, giving us a means to expedite the time to test and deploy the patched system from months and years to just days,” said Bratus.
To enable the creation and rapid implementation of assured micropatches, the AMP program will explore novel breakthroughs in binary decompilation and analysis, compiler techniques, and program verification. Today, engineers utilize software decompilers to understand the executable binary, which is a key step in the process of patching legacy software. While helpful, today’s decompilers are largely heuristic and only able to generate a “best guess” at what the original source code may have been like. AMP seeks to develop goal-driven decompilation, which would use existing source code samples, any available knowledge of the original build process, and other historic software artifacts to improve decompilation and direct it towards a specific goal, such as situating a known source code patch. By being able to guide decompilation, an engineer developing a binary micropatch is better able to translate knowledge of flaws from the source code to the binary, accelerating the identification, analysis, and repair of vulnerabilities in the binary.
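One small piece of that goal, situating a known source-level patch in decompiler output, can be caricatured as a matching problem. The toy below scores each decompiled function by token overlap with the patch's surrounding source context and picks the best match; the function names, snippets, and similarity metric are all invented for illustration, and real goal-driven decompilation would reason over much richer program representations.

```python
import re

def tokens(text: str) -> set:
    """Extract identifier-like tokens from a code snippet."""
    return set(re.findall(r"[A-Za-z_][A-Za-z0-9_]*", text))

def situate_patch(patch_context: str, decompiled_funcs: dict) -> str:
    """Return the decompiled function whose body best matches the
    source context of a known patch (Jaccard similarity on tokens)."""
    want = tokens(patch_context)
    def score(body: str) -> float:
        have = tokens(body)
        return len(want & have) / max(1, len(want | have))
    return max(decompiled_funcs, key=lambda name: score(decompiled_funcs[name]))

# Hypothetical fix context: a length check guarding a memcpy.
context = "if (len > buf_size) return -1; memcpy(buf, src, len);"
funcs = {
    "sub_401000": "int v1 = recv(fd, tmp, 64, 0); return v1;",
    "sub_4010c0": "memcpy(buf, src, len); buf_size = len; return 0;",
    "sub_401200": "for (i = 0; i < n; i++) crc = update(crc, d[i]);",
}
print(situate_patch(context, funcs))  # → sub_4010c0
```

The point of the sketch is only that external artifacts, here a known source patch, give the decompiler a concrete target to match against rather than forcing a best guess over the whole binary.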
In addition to goal-driven decompilation, AMP aims to develop “recompilers” that compile the desired source-level change against the existing binary and provide assurances that the intended functionality of the software is maintained after it is patched. Today, it is difficult to analyze changes in the binary as compilers take a clean-sheet approach—throwing out the existing binary and starting from scratch with each analysis. The AMP program will work to develop recompilers that preserve the binary as much as possible when the patch is applied and analyzed. Once a fix is applied, the novel recompilers will analyze the effects to ensure it does not disrupt the baseline functionality of the software.
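The weakest form of that preservation guarantee can be sketched as a byte-level check: after a patch is applied, every byte outside the patched region should be identical to the baseline. Real AMP-style assurance would reason about program semantics rather than raw bytes, and the region and values below are invented, but the check conveys what "preserving the binary as much as possible" means at the lowest level.

```python
def unchanged_outside(baseline: bytes, patched: bytes, region: range) -> bool:
    """True if `patched` differs from `baseline` only inside `region`."""
    if len(baseline) != len(patched):
        return False
    return all(baseline[i] == patched[i]
               for i in range(len(baseline)) if i not in region)

baseline = bytes(range(16))
patched = bytearray(baseline)
patched[5:8] = b"\xaa\xbb\xcc"       # hypothetical patched region

assert unchanged_outside(baseline, bytes(patched), range(5, 8))

patched[0] = 0xFF                    # a stray change outside the region
assert not unchanged_outside(baseline, bytes(patched), range(5, 8))
```

A clean-sheet recompile would fail this check almost everywhere, which is why the program calls for recompilers that carry the existing binary forward instead of regenerating it from scratch.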
To ensure the tools and techniques in development work as intended, AMP will run a number of challenges throughout the life of the program. The challenges will explore various cyber-physical mission-critical system use cases, and assess how effective the technologies are at rapidly patching legacy systems.