AI and Veterans: Tackling Suicide with Data-Driven Initiatives

The VA is leveraging AI to tackle veteran suicide, but a new proposal seeks to widen the net. Rep. Mackenzie aims to extend support to external organizations developing predictive models.
The Department of Veterans Affairs (VA) is ramping up its use of artificial intelligence (AI) to identify veterans at risk of suicide. But is that enough? Rep. Ryan Mackenzie of Pennsylvania thinks not. He's proposing a new approach, extending the reach of AI-driven suicide prevention beyond the VA's walls.
A Data-Driven Approach
The proposed Data Driven Suicide Prevention and Outreach Act seeks to bolster these efforts by offering grants for predictive model development. The VA's own 2025 AI use case inventory lists 367 examples of AI adoption, including five dedicated to identifying suicide risk among veterans. Mackenzie wants to expand this scope by offering grants to nonprofits, academic institutions, and research organizations.
Mackenzie's proposal doesn't simply lean on existing efforts like the REACH VET program. Launched in 2017, that program uses a predictive model to flag the top 0.1% of veterans at risk of suicide. The VA updated the model last year to include factors such as military sexual trauma, but Mackenzie argues for broader collaboration beyond the VA itself.
What's at Stake?
VA reports show that 6,398 veterans died by suicide in 2023, slightly lower than in previous years. Yet a staggering 61% of those veterans had not accessed VA healthcare services in the year before their deaths. That statistic spotlights an urgent need for wider-reaching solutions.
Grants under Mackenzie's proposal would target organizations with proven expertise in AI and healthcare. Only one grant would be awarded per Veterans Integrated Service Network (VISN), although the VA plans to consolidate these networks. The program would sunset in September 2029, but its impact could be significant.
Mixed Reactions and Future Directions
The proposal hasn't been universally welcomed. While the Wounded Warrior Project supports it, citing the potential for improved outcomes, the Veterans of Foreign Wars opposes it. They argue that it duplicates existing efforts and distracts from enhancing current programs.
Why should this matter? Mackenzie views AI as a "force multiplier," not a replacement for human clinicians. The initiative could augment human-led care by spotlighting risk factors that might otherwise go unnoticed. But can external organizations effectively collaborate with the VA on this critical issue?
As Mackenzie refines his proposal with stakeholders, the potential for AI in veteran care remains promising. The key question: Will this expanded approach succeed where others have faltered, or will it simply scatter resources too thinly?