
Congress charts diverging paths on AI in FY 2026 defense bills

As the Pentagon races to integrate AI across the services' operations, Congress is laying down distinct, and at times competing, frameworks to guide that evolution.

The Senate and House versions of the Fiscal Year (FY) 2026 National Defense Authorization Act (NDAA) reflect two sharply contrasting approaches to the role of AI in national defense, cybersecurity, and procurement. Together, they paint a picture of a Congress that continues to wrestle with how best to govern artificial intelligence while simultaneously accelerating its deployment.

In the Senate Committee on Armed Services’ NDAA draft, AI is threaded through cyber deterrence and procurement reform, and the bill “imposes protections on key military spectrum bands supporting AI-enabled and unmanned systems.”

The committee passed the bill and sent it to the full Senate for a vote.

The bill calls for a coordinated, “whole-of-government” strategy that links AI oversight with cybersecurity modernization, signaling an effort to ensure that military adoption of AI does not occur in isolation from broader national security priorities. This systemic approach emphasizes the interconnectedness of technological infrastructure, pairing AI governance with cyber resilience initiatives.

At the core of the Senate’s bill is a requirement that the Department of Defense (DOD) adopt a formal cybersecurity framework for AI procurement. This provision is meant to safeguard against vulnerabilities in the acquisition of AI technologies, especially as defense contractors increasingly rely on commercial and open-source AI tools. To ensure smaller vendors are not unduly burdened, the bill also calls for a report on how DOD’s Cybersecurity Maturity Model Certification (CMMC) requirements impact small businesses, which are often at the forefront of AI innovation but lack the compliance resources of larger firms.

Another critical element of the Senate’s proposal is the protection of military spectrum bandwidth for AI-enabled systems and unmanned platforms. The bill would bar modifications to certain spectrum allocations that are essential for these systems, effectively locking in the electromagnetic space necessary for autonomous weapons and surveillance tools to operate.

This provision signals a growing awareness within Congress that AI’s military potential is dependent not only on software and algorithms, but also on reliable digital infrastructure.

In addition to technology safeguards, the Senate draft incorporates acquisition reforms drawn from the FORGED Act, a legislative initiative aimed at modernizing defense procurement practices. These reforms expand the definition of “nontraditional” defense contractors to ease access for AI startups and smaller firms, encouraging faster onboarding of commercial innovations.

The Senate’s message is clear: if the U.S. wants to maintain its technological edge, it must reform the bureaucratic bottlenecks that prevent agile companies from contributing to the defense ecosystem.

The House’s NDAA draft, H.R. 3838 – the “Speed NDAA” – by contrast, proposes to drill into operational governance by requiring the Department of Defense to establish a department-wide policy specifically governing AI and cybersecurity. This mandate would codify how the military handles AI development, deployment, and oversight, something critics argue has been inconsistent or fragmented across the services.

One of the House bill’s most consequential AI provisions is its mandate for a Software Bill of Materials (SBOM) for all AI systems used by the military. An SBOM provides a comprehensive inventory of software components, dependencies, and open-source libraries used in an AI product.

By requiring SBOMs, lawmakers aim to bring transparency and accountability to the Pentagon’s growing reliance on third-party AI code, much of which may originate from foreign or unverifiable sources. The move echoes similar efforts in civilian cybersecurity where SBOMs are increasingly used to trace vulnerabilities and prevent supply chain infiltration.
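To illustrate what such an inventory captures, here is a minimal sketch of an SBOM in the open CycloneDX format, one of the formats commonly used for this purpose in civilian cybersecurity. The component names and versions are hypothetical; the bill does not specify a format.

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {
      "type": "library",
      "name": "pytorch",
      "version": "2.1.0",
      "licenses": [{ "license": { "id": "BSD-3-Clause" } }]
    },
    {
      "type": "machine-learning-model",
      "name": "example-threat-detection-model",
      "version": "0.9.1"
    }
  ]
}
```

Each component entry lets auditors trace where a dependency came from and whether it carries known vulnerabilities, which is the transparency the House provision is after.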

Perhaps the most ambitious provision in the House bill is the authorization for DOD to develop multiple generative AI streams – with some reports suggesting “as many as twelve” – for a variety of initiatives from AI-assisted command decision-making to automated threat detection and simulation environments for war-gaming scenarios.

While details are sparse, the scope of the language suggests the House envisions a wide-scale commitment to generative AI, including models similar to large language systems that have seen explosive growth in the commercial sector.

Security concerns are also front and center in the House version, which introduces enhanced cybersecurity requirements specifically tailored to AI applications. Although the public version of the bill summary does not detail these guardrails, members of the House Committee on Armed Services have made clear that they intend to impose stricter standards around model training data, algorithm auditing, and post-deployment monitoring.

These are all efforts that reflect broader concerns that AI systems, once operationalized, may exhibit unexpected behaviors or become vulnerable to adversarial attacks.

Although both chambers agree that AI must play a central role in future U.S. military strategy, the divergence in how they propose to get there is striking. The Senate’s approach prioritizes integration into broader national security and acquisition strategies, recognizing the systemic risks posed by unsecured or uncoordinated deployment, while the House favors a more prescriptive and operationally focused strategy, one that centers on internal governance, transparency, and definable AI initiatives.

The differences extend to how Congress views innovation itself. The Senate’s inclusion of the FORGED Act suggests confidence in commercial AI vendors and a desire to reduce friction in bringing their tools to the Pentagon. By expanding pathways for “nontraditional” firms to engage with DOD, the Senate hopes to break the cycle of overreliance on legacy defense contractors. The House, while not opposed to commercial innovation, is more cautious, insisting that oversight and transparency precede rapid adoption.

Timing may play a critical role in shaping the final contours of AI governance in the 2026 NDAA. The House Armed Services Committee is expected to begin formal markup of H.R. 3838 this week. Once each chamber finalizes its respective bill, a conference committee will reconcile the differences.

What ultimately emerges will have profound consequences for the trajectory of AI in the U.S. military. Should the Senate’s systemic safeguards be adopted, the Pentagon could see a more coherent and resilient AI ecosystem, but at risk of slower implementation.

If the House’s emphasis on transparency and predefined programs prevails, AI deployments could be more targeted and measurable, though potentially siloed and less adaptable to broader technological shifts. Either way, the 2026 NDAA stands to be one of the most consequential defense bills in recent memory when it comes to AI.
