Paper Summary: A Research Agenda for Flight Software Security
This post summarizes my 2023 IEEE SMC-IT paper "A Research Agenda for Space Flight Software Security" co-authored with Gregory Falco.
When I started my dissertation research, I went looking for the academic literature on flight software security. I found policy papers about space being a contested domain, a growing body of work on ground segment security, and almost nothing about the software actually running on spacecraft. That gap motivated this paper.
The goal wasn't to propose solutions. It was to systematically lay out what needed to be studied, so that researchers from adjacent fields--security, formal methods, embedded systems--would have a map of the territory and a reason to show up.
The Core Argument
Flight software occupies a strange position in the security world. It's the most critical software on a spacecraft, controlling everything from command and data handling to guidance, navigation, and attitude control. Compromise the flight software and you own the vehicle. And yet, the community that builds it has focused almost exclusively on quality and fault tolerance--protecting against random failures, not intelligent adversaries.
The distinction matters. Fault tolerance assumes failures are probabilistic and predictable. An adversary is neither. An attacker actively probes, adapts, and stresses the system in ways that environmental faults never will. The layers of redundancy that protect against a bit flip from a cosmic ray do nothing against someone who understands the command protocol.
The industry has operated under security-through-obscurity for decades, but that era ended when NASA open-sourced cFS and JPL released F Prime. Adversaries and researchers now have the same access to production-quality flight software architectures. That's a good thing--if researchers use it.
Twelve Research Directions
The paper's main contribution is a structured research agenda with twelve specific items, organized around the design considerations we identified from reviewing current flight software practice. A few that I think are especially important:
Secure-by-design flight software architecture (Agenda Item L). This is the capstone item and the one closest to my own ongoing work. The argument is that incremental patching of existing C-based flight software stacks won't get us where we need to be. We need architectures built from the ground up with security as a design constraint--formal verification integral to the design, memory-safe programming languages throughout, isolation between components enforced at the OS level. The DARPA HACMS program proved this was possible for aircraft systems using seL4 and verified application code. Nobody had done the equivalent for spacecraft.
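One way to picture "security as a design constraint" in a memory-safe language is to make a forbidden action unrepresentable rather than merely checked. The sketch below is purely illustrative (the module, token, and command names are hypothetical, not from the paper or any flight software stack): a command-authority token can only be constructed inside the dispatcher module, so no other component can forge a hardware command at compile time.

```rust
// Hypothetical sketch: enforcing a design constraint through the type
// system. Only the dispatcher module can mint a CommandAuthority token,
// so other components cannot issue commands without going through it.

mod dispatcher {
    /// Capability token. The private field means it can only be
    /// constructed inside this module.
    pub struct CommandAuthority {
        _private: (),
    }

    /// Grant authority. Real flight software would authenticate the
    /// uplink before handing this out; here we just mint the token.
    pub fn authorize() -> CommandAuthority {
        CommandAuthority { _private: () }
    }

    /// Issuing a command requires holding the token by reference.
    pub fn issue(_auth: &CommandAuthority, cmd: &str) -> String {
        format!("issued: {}", cmd)
    }
}

fn main() {
    let auth = dispatcher::authorize();
    println!("{}", dispatcher::issue(&auth, "WHEEL_SPINUP"));
    // Constructing the token outside the module would not compile:
    // let forged = dispatcher::CommandAuthority { _private: () }; // error
}
```

This is a compile-time analogue of the OS-level isolation the paper argues for: the boundary is part of the design, not a patch applied afterward.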
Understanding the full attack surface (Agenda Item I). Flight software is a layered system: CPU microcode at the bottom, kernel services and device drivers in the middle, mission applications on top, all communicating through interfaces that were designed for functionality, not security. We proposed a methodology that starts from the architectural decomposition and systematically characterizes every interface where two software components communicate--every seam where misuse could produce undefined behavior. This is fundamentally different from the ad hoc vulnerability hunting that characterizes most flight software security analysis today.
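A minimal sketch of what interface-centric enumeration could look like in practice (all component names and fields are hypothetical, invented for illustration): each seam between two components is recorded with a flag for whether its input domain has been characterized, and the analysis simply walks the decomposition to surface every seam that has not.

```rust
// Hypothetical sketch of systematic interface enumeration over a layered
// flight software decomposition. Component names are illustrative only.

/// One seam where two software components communicate.
struct Interface {
    provider: &'static str, // component exposing the interface
    consumer: &'static str, // component calling across the seam
    characterized: bool,    // has the input domain been analyzed?
}

/// Return every seam whose input domain has not been characterized --
/// the candidates for undefined behavior under adversarial input.
fn uncharacterized(seams: &[Interface]) -> Vec<&Interface> {
    seams.iter().filter(|i| !i.characterized).collect()
}

fn main() {
    let seams = vec![
        Interface { provider: "cmd_dispatch", consumer: "gnc_app", characterized: true },
        Interface { provider: "device_driver", consumer: "kernel_svc", characterized: false },
    ];
    for seam in uncharacterized(&seams) {
        println!("unanalyzed seam: {} -> {}", seam.consumer, seam.provider);
    }
}
```

The point of the sketch is the shape of the method, not the data structure: the seams come from the architectural decomposition itself, so coverage is exhaustive by construction rather than dependent on where a vulnerability hunter happens to look.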
Programming language selection (covered in Design Considerations, Section III-F). We made the case that the C language carries fundamental security risks that coding standards and static analysis can only partially mitigate. The paper surveys alternatives beyond Rust, including Ada SPARK, D, Nim, and Ivory, and discusses advanced type theory work like dependent types and linear types that could further restrict unsafe behavior. The NSA had just released guidance recommending memory-safe languages, and the aerospace industry wasn't paying attention.
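To make the memory-safety point concrete, here is a small Rust sketch (the packet layout is hypothetical, not from any flight software standard): an out-of-bounds read that would be undefined behavior in C becomes a recoverable `None` in safe Rust, forcing the caller to handle the short packet explicitly.

```rust
// Hypothetical command packet parsing. In C, reading past the end of a
// buffer is undefined behavior; in safe Rust, slice::get returns None.

/// Extract the 2-byte big-endian opcode from a command packet, if the
/// packet is long enough. The layout is illustrative only.
fn parse_opcode(packet: &[u8]) -> Option<u16> {
    let bytes = packet.get(0..2)?; // bounds-checked access, no UB possible
    Some(u16::from_be_bytes([bytes[0], bytes[1]]))
}

fn main() {
    // Well-formed packet: opcode is the first two bytes.
    assert_eq!(parse_opcode(&[0x12, 0x34, 0xFF]), Some(0x1234));
    // Truncated packet: no out-of-bounds read, just None.
    assert_eq!(parse_opcode(&[0x12]), None);
    println!("ok");
}
```

Coding standards and static analyzers try to catch this class of bug in C after the fact; here the language rules it out by construction, which is the distinction the paper draws.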
The Frameworks
Two frameworks anchor the paper's approach to the problem. NIST SP 800-160 Volume 2 defines cyber resilience as the ability to anticipate, withstand, recover from, and adapt to adverse conditions--including deliberate attack. That document gives us resilience objectives. Bailey's four principles for space cyber resilience--robustness, opacity, constraint, and responsiveness--translate those objectives into something applicable to flight software design.
On the threat side, SPARTA (Space Attack Research and Tactic Analysis) was the best available framework for cataloging adversary tactics and techniques against space systems. We used it to ground the agenda in real attack patterns, but noted it was still in its early stages and needed significant expansion and validation.
The paper was deliberately framed as an invitation. Several of the twelve agenda items became the basis for our subsequent work: the attack surface analysis of cFS, the investigation of secure programming languages, and the Alcyone secure flight software architecture.