Agency Deep Dives

Air Force Software Innovation Trends in 2026

Reading the publicly released Department of the Air Force open-innovation SBIR solicitations for 2026 to understand the software-shaped problems the agency is making room for, without speculating about any particular submission.

Open-Literature Reading

Everything below comes from peer-reviewed papers, the publicly released BAA, and open agency documents. Internal Precision Federal solution content, proposal text, and any program-office communications are off-limits for public articles in active program spaces, and none appears here.
Recurring software-shaped problem categories in 2026, scored 0–100:

  • Data engineering for legacy systems — 90
  • Operator-facing decision support — 84
  • AI-assisted maintenance and sustainment — 76
  • Software-acquisition modernization tooling — 70
  • Customer-discovery quality at submission — 54
  • Phase II-to-Phase III transition rates — 38

Higher score = the pattern is more strongly evidenced in the public 2026 DAF record.

The open-innovation mechanism

The Department of the Air Force runs an open-innovation SBIR pathway in which offerors identify both the operational problem and the customer who has it, and propose solutions accordingly. The publicly available BAA describes the structure: the offeror finds the customer, the offeror identifies the problem, the offeror proposes the technology. The open-innovation pathway is one of the most flexible in DoD SBIR and consequently one of the most competitive.

WHAT 2026 DAF SOFTWARE INNOVATION LOOKS LIKE PUBLICLY

Customer discovery is the central methodology. Successful Phase II performers cite identifiable customers by name and role; generic capability pitches without an appropriation path do not transition.

Reading the 2026 release

The 2026 DAF open-innovation releases — visible on dodsbirsttr.mil — include several software-shaped problem categories that recur across the customer set: data engineering for legacy systems, operator-facing decision support for complex tasks, AI-assisted maintenance and sustainment, and tooling for software-acquisition modernization. Each of these has a substantial public literature and a recognizable transition pattern through Air Force program offices.

The data-engineering category aligns with the unclassified architecture publications from the DAF Chief Data and Artificial Intelligence Office (CDAIO) and the broader Combined Joint All-Domain Command and Control (CJADC2) program documents. The operator-facing decision-support category aligns with the human-systems integration literature from AFRL's 711th Human Performance Wing and the published doctrine on cognitive workload in time-pressured tasks. The maintenance and sustainment category draws on the well-developed prognostics-and-health-management (PHM) literature and the public documentation from Air Force Materiel Command's logistics enterprise.

The acquisition-modernization category is the youngest of the four and the most policy-driven. Public DAF software policy — Software Acquisition Pathway, BRAVO and Kessel Run-style platform-team experimentation, the published Software Modernization Strategy — defines the problem space in which acquisition tooling solicitations live. Offerors who read the policy literature alongside the technical literature land closer to what reviewers expect.

What "software-first" means here

Open-program solicitations attract a wide mix of hardware, software, and service offerings. The software-first subset has a different proposal grammar: smaller capital expense, faster prototype cycles, and a different transition story (a pilot deployment in a specific operational context rather than a hardware delivery). The published patterns from successful software-first AFWERX performers — Kessel Run alumni, Platform One adjacencies, several explicitly software-only firms in the public Phase III record — describe a recognizable shape: continuous deployment as the prototype delivery model, observability as a primary engineering deliverable, and operator co-development from week one.

Reviewers familiar with the open-innovation pathway distinguish between offerors whose Phase I plan is genuinely software (build, instrument, test, iterate) and offerors who use software vocabulary to describe an integration project. The cleanest signal of a software-first plan is the Phase I deliverable list: working code with a documented test plan and an operator-facing demo, not a study report or a future-look architecture diagram.

Public Air Force guidance, including the DAF CIO's published statements on software factories and the Air Force Software Acquisition Pathway documents, repeatedly emphasizes that software is delivered through ongoing capability releases, not one-time deliveries. Phase I plans that mirror this posture — even at small scale — are read as more credible than plans that propose a single end-of-period deliverable.

The SBIR is not a research grant; the open-innovation mechanism is least forgiving of treating it as one.

Public software-priority patterns

  • Data engineering — Legacy schema integration, governance, observability for ML readiness.
  • Decision support — Operator-facing tools for complex tasks with calibrated confidence.
  • AI-assisted maintenance — Sustainment workflows aided by automated diagnostics.
  • Software-acquisition tooling — Modernizing how software is procured and delivered.
  • Customer discovery — Steve Blank-lineage methodology for finding customer pain before proposing technology.

Customer discovery as a research method

The publicly stated AFWERX guidance treats customer discovery as the central methodology of an open-innovation Phase I. The point is not to confirm the offeror's existing solution but to find out whether the offeror's problem statement matches a real operator pain point. The Steve Blank lineage of customer-development methodology, adapted by AFWERX into the published "Customer Discovery" framework, gives offerors a structured way to do this — and reviewers use the same framework as their evaluation rubric.

Several published case studies from successful AFWERX performers describe pivoting their proposed approach mid-Phase I based on customer interviews. The pivot pattern is recognizable in the public record: the offeror enters Phase I with a hypothesis, conducts a documented set of customer-discovery interviews, finds the actual pain point is adjacent to the hypothesis, and rewrites the Phase II SOW to address the actual pain point. The pivot is not a Phase I failure mode; it is the Phase I success mode.

Offerors who treat customer discovery as a checklist item, rather than a research method, perform worse on Phase II and Phase III evaluation. The published guidance is specific: minimum interview counts, structured note-taking, evidence preservation, and synthesis into a problem statement that the customer themselves will validate. Phase II SOWs that cite specific operator quotes from customer-discovery interviews are taken more seriously than SOWs that summarize "operator feedback" in the offeror's own voice.
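The evidence discipline described above — structured notes, preserved verbatim quotes, synthesis into a validated problem statement — can be sketched as a minimal data structure. The field names and the synthesis rule below are illustrative assumptions, not an AFWERX-published template:

```python
# Minimal sketch of structured customer-discovery evidence capture.
# Field names and the synthesis threshold are illustrative assumptions,
# not drawn from any AFWERX-published template.
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class InterviewNote:
    interviewee_role: str          # e.g. "maintenance officer"
    date: str                      # ISO date of the interview
    pain_points: list[str] = field(default_factory=list)
    verbatim_quotes: list[str] = field(default_factory=list)  # evidence preservation

def synthesize(notes: list[InterviewNote], min_mentions: int = 2) -> list[str]:
    """Return pain points raised independently by multiple interviewees."""
    counts = Counter(p for n in notes for p in n.pain_points)
    return [p for p, c in counts.items() if c >= min_mentions]

notes = [
    InterviewNote("maintenance officer", "2026-01-14",
                  pain_points=["legacy data export", "manual rekeying"]),
    InterviewNote("flight-line supervisor", "2026-01-21",
                  pain_points=["manual rekeying"]),
]
print(synthesize(notes))  # ['manual rekeying']
```

The point of the structure is the one the guidance makes in prose: a pain point that survives synthesis across independent interviews, backed by preserved quotes, is evidence; a pain point asserted in the offeror's own voice is not.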

Transition realities

Phase I-to-Phase II transitions for DAF open programs are tracked in public Air Force SBIR reporting and are sobering at the topline: the Phase I-to-Phase II transition rate for the open-innovation pathway has been published in roughly the high-single-digit to low-double-digit percentage range across recent cycles, depending on how the cohort is defined. The exact rate moves cycle to cycle, and the public reporting documents the methodology behind each figure.
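The sensitivity to cohort definition is easy to make concrete with a toy calculation. All numbers below are invented for illustration and are not drawn from any Air Force report; the point is only that the same award count yields a single-digit or double-digit rate depending on the denominator:

```python
# Toy illustration: how cohort definition moves a Phase I -> Phase II
# transition rate. All figures are invented for illustration only.

def transition_rate(transitions: int, cohort_size: int) -> float:
    """Transitions as a percentage of the chosen denominator."""
    return 100.0 * transitions / cohort_size

phase_ii_awards = 40    # hypothetical Phase II awards in a cycle

all_submissions = 520   # denominator 1: every Phase I submission
phase_i_awards = 310    # denominator 2: only funded Phase I firms

rate_vs_submissions = transition_rate(phase_ii_awards, all_submissions)
rate_vs_awards = transition_rate(phase_ii_awards, phase_i_awards)

print(f"vs. all submissions: {rate_vs_submissions:.1f}%")  # 7.7%
print(f"vs. Phase I awards:  {rate_vs_awards:.1f}%")       # 12.9%
```

This is why the public reporting's methodology notes matter: comparing rates across cycles is only meaningful when the cohort definitions match.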

Successful transitions tend to share a structure: the Phase I customer was identifiable by name and rank; the operator demand was specific; and the Phase II SOW described a measurable extension of capability rather than a research extension. Phase II-to-Phase III transitions follow a similar logic at higher dollar amounts, often with a non-SBIR appropriations account named in the proposal — STRATFI and TACFI being the AFWERX-administered vehicles that several published Phase III case studies cite.

The non-SBIR appropriations account question is the most overlooked part of the transition story. The published Air Force Phase III case studies show that the most durable transitions are those where the operator's command has identified its own funding line for sustainment, not those where the offeror is hoping for a follow-on SBIR. Offerors who can name the appropriation account, the program element, and the responsible program office in their Phase II proposal are positioned for transition; offerors who cannot are not.

| Phase | What reviewers reward | What they discount |
| --- | --- | --- |
| Phase I | Documented customer discovery; named operator; software deliverables | Future-look architecture diagrams; generic "operator feedback" |
| Phase II | Measurable capability extension; operator co-development plan | Open-ended research extensions; vague evaluation criteria |
| Phase III | Named non-SBIR appropriation; responsible program office identified | Hopes for another SBIR; unspecified sustainment path |

Posture for offerors

The honest posture for a software-first offeror engaging the open-innovation pathway is to do the customer-discovery work before submitting, to write the problem statement in the customer's language, and to make the Phase I deliverables specific enough that the customer can evaluate them. The published AFWERX coaching materials emphasize this repeatedly; the reviewers internalize it.

The second posture point is to align the proposed software to the published Air Force reference architectures rather than to propose against an idealized greenfield. The DAF Software Acquisition Pathway, the CDAIO data publications, and Platform One's published reference patterns all give offerors specific conventions to align to. Offerors who name the alignment explicitly are read as more credible than offerors who propose a freestanding stack.

The successful offerors in the public record have treated each phase as a deliverables-driven contract with a specific operator audience, not as exploratory funding.

Concept terms in this problem class

Open-program vehicle. A solicitation pathway in which the offeror identifies both the customer and the problem, with the agency reviewing the fit rather than naming the technical scope.

Customer discovery. The methodology of validating that a problem statement matches a real operator pain point, treated by AFWERX guidance as the central activity of an open-innovation Phase I.

Transition story. The plausible path from a Phase II prototype to a non-SBIR appropriations account in a named program office.

Common questions on the public-record framing

What public reference architectures inform DAF software?

DAF Chief Data and AI Office publications, CJADC2 lineage, AFRL 711th HPW, AFMC PHM, the DAF Software Acquisition Pathway, and the Software Modernization Strategy.

Why does customer discovery dominate Phase I posture?

The Steve Blank lineage of customer development — adapted for defense — is the published guidance. Successful Phase I performers describe pivots based on customer interviews.

Where does Phase III transition come from?

STRATFI/TACFI, named appropriation accounts, and direct sponsor commitment. Generic capability pitches without an appropriation path do not transition.

What does this article not cover?

Specific named offerors, specific customer offices in active engagement, or any Precision Federal positioning.

Frequently asked questions

What is an Air Force open-innovation SBIR?

A solicitation pathway in which the offeror identifies both the operational problem and the customer who has it, then proposes the technology. The structure is publicly documented in the BAA.

How is "software-first" different from a software-mentioning hardware proposal?

Software-first means the Phase I plan is genuinely software work — build, instrument, test, iterate — rather than a hardware integration project that uses software vocabulary. Reviewers experienced with open programs distinguish the two.

Why is customer discovery emphasized so heavily?

Because the open-innovation pathway is least forgiving of offerors who confirm their own existing solutions rather than validate a real operator pain point. AFWERX guidance treats customer discovery as the central methodology, not a checklist item.

What does a transition-credible Phase II SOW look like?

It names the Phase I customer by name and rank, describes a specific operator demand, and proposes a measurable extension of capability rather than a research extension — frequently with a non-SBIR appropriations account named for Phase III.

Why this work matters to us

Precision Federal is a software-only SBIR firm. The reason articles like this one exist on this site is simple: federal program offices fund teams whose principal investigators have demonstrated, in public, that they think carefully about the problems the program is trying to solve. We write to demonstrate that posture, not to telegraph any particular technical approach. If your office is exploring the problem class above and wants a partner who reads the literature, codes the prototypes, and ships under a Phase I or Direct-to-Phase-II SOW, we are listening.

Pursuing an Air Force open-innovation award?

If your office or your team is engaging the AFWERX open-innovation pathway and wants a software-first PI as a partner, we welcome the introduction. We respond within one business day.

UEI Y2JVCZXT9HP5 · CAGE 1AYQ0 · NAICS 541512 · SAM.gov active