Defense Media Network

VA Research:
The National Artificial Intelligence Institute

The VA’s new institute aims to improve veterans’ lives through advanced analytics.

In the spring of 2020, as the COVID-19 pandemic continued to confound doctors’ efforts to predict patient outcomes, the Department of Veterans Affairs’ (VA) new National Artificial Intelligence Institute (NAII) stepped up to the plate: In collaboration with the Washington DC VA Medical Center, it developed a predictive model that combined patient data from the department’s Corporate Data Warehouse (CDW) with demographic and other community data. This data, analyzed by artificial intelligence (AI) software using machine-learning algorithms, would be used to predict prognostic risks for various outcomes, including whether an individual patient would succumb to the disease within 30 days.
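
What such a model does can be illustrated, at a high level, with a short sketch – here, a gradient-boosted classifier trained on synthetic tabular data. The feature names, data, and model choice are invented for illustration and are not drawn from the VA’s actual implementation:

```python
# Hypothetical sketch: a 30-day mortality risk model trained on tabular
# patient and community features. All columns and data are synthetic
# assumptions, not the VA's CDW feature set.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age":               rng.integers(40, 95, n),
    "oxygen_saturation": rng.normal(94, 4, n),
    "platelet_count":    rng.normal(220, 60, n),
    "county_case_rate":  rng.normal(50, 20, n),   # community-level signal
})
# Synthetic outcome loosely tied to age and oxygen saturation.
df["died_within_30_days"] = (
    (df["age"] / 100 - df["oxygen_saturation"] / 100 + rng.normal(0, 0.2, n)) > 0
).astype(int)

X, y = df.drop(columns="died_within_30_days"), df["died_within_30_days"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier().fit(X_tr, y_tr)

# Each prediction is a probability between 0 and 1, analogous to the
# risk scores (e.g., 0.31) surfaced on the clinicians' dashboard.
risk = model.predict_proba(X_te)[:, 1]
print("AUC:", round(roc_auc_score(y_te, risk), 2))
```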

The work showcased a new type of collaboration while working through various policy and computational issues. The experimental tool consisted of two main elements: the mortality model itself, which performed better than existing models; and a COVID-19 dashboard for VA clinicians, which would display a patient’s risk score, along with the individual and environmental factors driving that risk.

The COVID-19 dashboard, piloted at the Washington DC VA Medical Center, is available to VA clinicians and offers two distinct views: Primary care teams can filter datasets by a patient’s health care provider, track COVID-19 testing, and view the mortality risk scores – probabilities generated by the model. Inpatient care providers can filter datasets by hospital location and specialty.


Hospitalists at the DC VA Medical Center were all given access to the model to assess its potential for ensuring, prior to a patient’s discharge, that no hidden or unusual factors might affect the prognosis. Though still in the pilot phase, the new predictive model is already producing encouraging results: Clinicians tested the model on a COVID-19 patient with a kidney disorder. The hospitalist believed the model would overweight the preexisting condition and predict mortality for a patient who was recovering – but this was not the case; the patient had a relatively low mortality risk score (0.31) once factors such as oxygen saturation and platelet levels were factored into the prediction.

 

THE VA’S NEW INSTITUTE

The COVID model and dashboard pilot at the DC VA Medical Center, along with other VA AI capacity-building efforts, demonstrate something that may not be widely known outside the VA health care system: AI is already being applied in many circumstances to improve the diagnosis and treatment of veteran patients.

It’s no wonder: AI works best when algorithms can process a lot of underlying, interrelated information, and the Veterans Health Administration operates the largest integrated health care system in the United States, serving more than 9 million patients at more than 1,200 medical facilities. Through its Million Veteran Program, it has collected and curated genomic data from about 800,000 donated blood samples, the largest such database in the nation.


James A. Haley Veterans’ Hospital pathologists Dr. Andrew Borkowski, left, and Dr. Stephen Mastorides examine tissue-sample slides under a microscope. Borkowski and Mastorides are training a machine-learning module to differentiate between cancerous and healthy cells in images taken from specimen slides. VA photo by Robert Turtil

VA medical records also contain more than a billion images from scans, X-rays, and other technologies – a rich potential resource for AI applications. Earlier this year, for example, Drs. Stephen Mastorides and Andrew Borkowski of the James A. Haley Veterans’ Hospital in Tampa, Florida, began training a machine-learning module to recognize the difference between cancerous and healthy cells in images taken from specimen slides. In their earliest studies, machine-learning software was able to diagnose cancer with accuracies above 95 percent.
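
The underlying approach – training a classifier on labeled slide images – might look roughly like the hedged sketch below, which fine-tunes a pretrained convolutional network in PyTorch. The folder layout, model choice, and hyperparameters are assumptions for illustration, not the Tampa team’s documented setup:

```python
# Illustrative sketch only: fine-tuning a pretrained CNN to separate
# cancerous from healthy cells in labeled slide-image patches.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
# Assumes folders slides/cancerous/ and slides/healthy/ of image patches.
train_set = datasets.ImageFolder("slides", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights="IMAGENET1K_V1")   # pretrained backbone
model.fc = nn.Linear(model.fc.in_features, 2)      # two classes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:       # one pass for brevity
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```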

The VA trains more doctors and nurses than any other health care entity in America, and the R&D emerging among VA investigators and clinicians reveals the enormous potential of AI to transform research and care. The key to unleashing this potential will be the ability to coordinate these explorations and uses of AI and to apply a strategic, veteran-centered focus to VA research and clinical care.

To help achieve this focus, the NAII was formally established last fall – in November, National Veterans and Military Families Month. Gil Alterovitz, PhD, FACMI, FAMIA, who was one of the core writers of the updated “National AI R&D Strategic Plan” released in 2019, is leading the new institute. He is also a Harvard Medical School faculty member in the Department of Medicine and has led national and international initiatives in the innovative use of health care data and technology.

The institute is a joint effort between the VA’s Office of Research and Development and the Secretary’s Center for Strategic Partnerships; the NAII will combine the resources and expertise of the VA and its partners in the public and private sectors. It will focus on partnerships, policy, pilots, and community engagement around AI. For example, VA collaborations have included a partnership with DeepMind to predict the onset of deadly kidney disease. The institute is working across offices and with industry to leverage this and other models at the VA for AI R&D and education.

Less than a year old, the new institute is working on a strategy that begins, Alterovitz explained, with two primary components: “We want to make a difference for the veterans, their health and well-being,” he said. “So we’ll need to know the veterans’ priorities, and that’s one of the things we’re working to discover.” In order to use AI to meet those priorities, VA will bring different offices and a veteran engagement board together to determine priorities and pilot the use of these cutting-edge technologies.

“There are a number of offices that are working on different applications related to artificial intelligence,” Alterovitz said, “so we established an AI Task Force that is bringing together offices across the VA – and not just in research. That way, we can work together to combine and leverage some of these applications, and make a difference for the veterans directly.”

Earlier this year, the institute was cited as one of the early national AI successes in the first annual report of the American AI Initiative (link: https://www.whitehouse.gov/wp-content/uploads/2020/02/American-AI-Initiative-One-Year-Annual-Report.pdf).

 

AI MEETING VETERAN NEEDS

The careful work of coordinating and strategizing is important to AI’s future in the VA, but the NAII is also actively working to encourage the exploration and implementation of AI to solve urgent problems.


The NAII has pioneered what it calls AI Tech Sprints: time-limited competitions, conducted in partnership with the Department of Commerce, designed to encourage collaboration with potential partners in the academic, industrial, and nonprofit sectors. The approach earned a Government Innovation Award for public-sector innovations in 2019. Teams that compete in AI Tech Sprints are charged with designing AI-enabled tools that address veteran needs, all while interacting with VA researchers.

More than 10 teams, most of them representing leading software and health care companies, participated in the institute’s inaugural AI Tech Sprint that showcased results in December 2019, an event that produced several promising innovations – one of which was developed by three high school students from Virginia. The students’ Clinical Trial Selector used natural language processing to draw patient information from records, applied AI to sort through characteristics of cancer patients – including age, gender, lab values, and types of cancer – and then ran that information through an interface with the National Cancer Institute to match eligible patients with clinical trials.
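
The matching step the students describe could, in simplified form, look something like the sketch below. The trial records, eligibility rules, and field names are invented for illustration; the real tool worked through an interface with the National Cancer Institute rather than a local list:

```python
# Hypothetical sketch of trial matching: compare patient attributes
# (already extracted from records) against structured eligibility criteria.
from dataclasses import dataclass

@dataclass
class Patient:
    age: int
    sex: str
    cancer_type: str
    creatinine: float

@dataclass
class Trial:
    trial_id: str
    cancer_type: str
    min_age: int
    max_age: int
    max_creatinine: float

def eligible(p: Patient, t: Trial) -> bool:
    # Illustrative rules only; real eligibility criteria are far richer.
    return (p.cancer_type == t.cancer_type
            and t.min_age <= p.age <= t.max_age
            and p.creatinine <= t.max_creatinine)

patient = Patient(age=68, sex="M", cancer_type="prostate", creatinine=1.1)
trials = [
    Trial("NCT-EXAMPLE-1", "prostate", 50, 80, 1.5),
    Trial("NCT-EXAMPLE-2", "lung", 40, 75, 1.2),
]
matches = [t.trial_id for t in trials if eligible(patient, t)]
print(matches)   # ['NCT-EXAMPLE-1']
```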


High school students (from left) Ethan Ocasio, Neeyanth Kopparapu, and Shreeja Kikkisetti developed the Clinical Trial Selector, which won honorable mention at the NAII’s inaugural AI Tech Sprint. VA photo

Today, matching veterans to clinical trials can be a tedious, frustrating process. This new tool, developed in a short period of time by high school students, could be adapted or scaled up to have a huge effect for both veteran patients and VA investigators. Alterovitz imagines a search engine that might allow veteran patients to log in and find – and sign up for – trials matching their medical information. “The VA researcher who is recruiting patients for a trial will automatically have another patient,” said Alterovitz. “And, most importantly, the patient benefits, because they get this experimental treatment – which they might not have been able to access otherwise – that could work for their condition.” The team later received production access to VA data so that veterans could use the app, and their feedback was used to enhance the VA health information available to veterans.

The Clinical Trial Selector won honorable mention at the Tech Sprint, whose overall winner was a team from the digital consulting company Composite Apps. The application developed by this team, CURA Patient, is a web-based platform for coordinating care and making care plan details available to patients and their families, providers, specialists, and payers. The platform is a complex, multifunctional tool – for example, Alterovitz pointed out, it can apply machine learning to imagery and function as a virtual caretaker: A patient could take a picture of their pills for the day with a smartphone, send it to the cloud for processing, and receive immediate feedback. “It might tell you: ‘Oh, you’re missing this medication right now. You should be taking this twice a day, and you don’t have it in your hand.’”
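
The “virtual caretaker” feedback Alterovitz describes boils down to comparing what an image model detects against the day’s medication schedule. A toy sketch of that comparison step – with the image-recognition part assumed to have already produced pill labels, and both lists invented for illustration – might look like this:

```python
# Toy sketch (assumed logic): flag doses that are due today but not
# present among the pills detected in the patient's photo.
from collections import Counter

schedule = Counter({"metformin": 2, "lisinopril": 1})  # doses due today
detected = Counter(["metformin", "metformin"])         # labels from image model

missing = schedule - detected        # Counter subtraction keeps shortfalls
for drug, count in missing.items():
    print(f"Missing {count} dose(s) of {drug} today.")
```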

In addition to optimizing care for individual patients, said Alterovitz, CURA Patient can be scaled up to steward the cost and efficacy of care throughout a health care system. “It has different phases for how it can be integrated into different systems,” he said, “and at each level, there are different pieces of AI that contribute to it.”

 

THE FUTURE OF AI IN VETERAN CARE

As it encourages novel interactions with organizations outside the VA, the NAII is working to build interdisciplinary partnerships in AI research and development. “We’re looking at doing different collaborations and leveraging different types of data that traditionally have not been used for veterans,” said Alterovitz. For example, an NAII team consisting of Alterovitz and Christos Makridis, PhD, collaborated with professor Cosmin Bejan, PhD, of Vanderbilt University, Stanford University computer scientist David Zhao, and the polling firm Gallup to examine machine-learning approaches that could quantify the role of socioeconomic factors in veterans’ physical well-being.


While some socioeconomic data can be found in VA medical records, it’s not typically a focus – but responses to survey questions, carefully crafted by Gallup, have the potential to add detail and discover connections that medical records alone may not be able to reveal. “This is thinking of novel ways to apply AI,” said Alterovitz, “using new processes and partnering approaches. It’s based on data gathered from outside the VA, but focuses on how that data, with AI, can help improve veterans’ well-being.”

As the NAII continues to gather information about how AI is being applied throughout the VA, a picture is emerging of what kind of AI is most needed by VA researchers. One of the NAII’s first tasks has been to define the challenges VA researchers and clinicians may encounter in developing and implementing AI, and to identify key areas for advancing AI R&D at the VA. Specifically, the NAII has identified five: 1) deep learning, 2) trustworthy AI, 3) privacy-preserving AI, 4) explainable AI, and 5) multiscale AI analysis.

 

Deep Learning.

Deep learning is modeled after how our brains function. It utilizes artificial neural networks with specialized, multi-layered architectures, and can learn tasks by analyzing training examples. “It’s a technique that is especially useful when you have noisy data sets in related pieces of information,” said Alterovitz. “And that’s exactly what we see many times in the VA: You’ve got imaging. You’ve got language that is processed from clinical notes, which can include typos. Some of these are quite messy. They are noisy and large – the largest integrated health care system in the country, all running essentially the same overall software.”
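
A minimal example of the kind of multi-layered network described here – arbitrary layer sizes, synthetic inputs, and no claim to any specific VA model – can be sketched in a few lines of PyTorch:

```python
# Minimal illustration of a multi-layer ("deep") neural network.
# The sizes and task are arbitrary; it simply shows the stacked,
# specialized architecture the text describes.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(64, 128), nn.ReLU(),   # input features -> first hidden layer
    nn.Linear(128, 64), nn.ReLU(),   # second hidden layer
    nn.Linear(64, 1), nn.Sigmoid(),  # output: probability of an outcome
)

x = torch.randn(8, 64)     # a batch of 8 synthetic feature vectors
print(model(x).shape)      # torch.Size([8, 1])
```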

The VA/DeepMind collaboration to predict the onset of deadly kidney disease – and to enable prophylactic treatment – is an example of deep learning at work.


Dr. Thomas Osborne is chief medical informatics officer at the VA Palo Alto Health Care System and leader of a trial there to evaluate the artificial intelligence model developed by the VA/DeepMind partnership to predict the onset of deadly kidney disease. VA photo by Adan Pulido

 

Trustworthy AI.

Trustworthy AI is developed around the nation’s laws and values – which may seem a self-evident characteristic of AI, but some machine-learning algorithms, despite the intentions of their designers, can produce untrustworthy results. For example, last fall, a research group reported a study in Science that found a health insurance algorithm employed by major U.S. hospitals was biased against Black patients, systematically underestimating their need for care.

One important assertion of trustworthy AI is that the training data used to develop algorithms must be representative of the people the algorithm is designed to help. VA data, which reflects the wide range of experiences and attitudes and the ethnic, geographic, and gender composition of the veteran population, can help researchers train algorithms that are fair and equitable to all veterans.
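
One simple, commonly used check along these lines – offered only as an illustrative sketch on synthetic scores, not a description of VA practice – is to compare a model’s discrimination across demographic subgroups before deployment:

```python
# Hedged sketch: compare a model's AUC across demographic subgroups.
# The group labels, outcomes, and scores below are synthetic.
import pandas as pd
from sklearn.metrics import roc_auc_score

results = pd.DataFrame({
    "group": ["A", "A", "B", "B", "A", "B", "A", "B"],
    "label": [1, 0, 1, 0, 1, 1, 0, 0],
    "score": [0.9, 0.2, 0.6, 0.4, 0.8, 0.7, 0.3, 0.1],
})

for group, rows in results.groupby("group"):
    print(group, "AUC:", round(roc_auc_score(rows["label"], rows["score"]), 2))
```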

 

Privacy-preserving AI.

The tremendous amount of veteran health data collected by the VA creates a challenge when that data is accessed by researchers and partners: Information is essential for training powerful AI algorithms to serve veterans, but VA data often contains sensitive health details, and it is important to obtain results without violating privacy by identifying individual patients. One option is a cutting-edge technique known as homomorphic encryption, which allows a program to perform analysis or calculations on encrypted health information without revealing any information about a patient’s identity.
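
To illustrate the idea (and only the idea), the toy sketch below implements a miniature Paillier-style additively homomorphic scheme with deliberately tiny keys: two values are encrypted, their ciphertexts are combined, and the correct sum is recovered without the computation ever seeing the plaintexts. Production systems rely on vetted libraries and far larger parameters; nothing here reflects the VA’s actual tooling:

```python
# Toy illustration only: a miniature Paillier-style additively homomorphic
# scheme showing arithmetic on ciphertexts without decrypting them.
import random
from math import gcd

p, q = 11, 13                                   # toy primes (never this small)
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p-1, q-1)
g = n + 1

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)             # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(42), encrypt(17)   # e.g., two encrypted lab values
c_sum = (c1 * c2) % n2              # multiplying ciphertexts adds plaintexts
print(decrypt(c_sum))               # 59, computed without exposing 42 or 17
```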

 

Explainable AI.

Another priority, particularly for clinicians, is explainable AI: Physicians want to understand why a machine made a decision that will affect a patient’s health. Feeding data to an algorithm and getting an answer – for example, recommending a certain medication for treatment of a patient – without any insight into how that recommendation was reached is an experience known as the “black box” phenomenon. Many physicians – understandably – approach such results with skepticism.

The Prediction Modeling Unit at the Ann Arbor VA Center for Clinical Management Research (CCMR) encountered this problem when developing an AI model for predicting symptomatic flare-ups associated with inflammatory bowel disease (see sidebar below). “Some of the statistical methods we’ve used in the past,” said Ann Arbor VA gastroenterologist Akbar Waljee, “are easily understood. We do confront some barriers or challenges with people adopting some of these machine-learning methodologies, because some are likely to say: ‘Well, you just threw data into a computer, and it came up with something. How do you know why it came up with that?’” Explainable AI, said Waljee, produces more than just a decision or recommendation; it presents users with a list of decision points during the processing of a given algorithm – an illustration of how it arrived at its conclusions.
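
One common way to produce that kind of decision-point trail – shown here as a hedged sketch on synthetic data, not the Ann Arbor team’s actual method – is to use an inherently interpretable model, such as a shallow decision tree, and print the rules it applied:

```python
# Hedged sketch: train an interpretable model and expose its decision rules.
# Feature names, data, and thresholds are synthetic assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                    # e.g., three lab values
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)    # synthetic flare label
feature_names = ["albumin", "hemoglobin", "platelets"]

clf = DecisionTreeClassifier(max_depth=3).fit(X, y)

# The printed rules are the "list of decision points" a clinician can
# inspect instead of a black-box score.
print(export_text(clf, feature_names=feature_names))
```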

Likewise, said Alterovitz, the new COVID-19 dashboard pilot makes the AI behind its model explainable to clinicians, who do not have to take its risk scores at face value. They can see all the inputs that were included in – and excluded from – the model, and how they were interrelated and considered, in order to decide whether they need to leverage its findings and/or seek more information to inform their treatment of a patient.

 

Multiscale AI Analysis.

A system with the size and scope of the VA, said Alterovitz, will need AI applications that can simultaneously analyze items at multiple scales: integrating deep learning models across several modes of medical imagery, for example, from the molecular level to gross anatomy. Multiscale analysis can also refer to time: using an observation or lab value from a given moment in time to predict trends over days, months, or even years.
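
The time-scale side of that idea can be sketched simply: summarizing one synthetic lab value over several windows so a downstream model can see short-term and long-term behavior at once. The variable name and windows here are illustrative assumptions:

```python
# Illustrative sketch of multiscale analysis over time: aggregate a single
# lab value across different windows. Data is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
dates = pd.date_range("2023-01-01", periods=365, freq="D")
creatinine = pd.Series(1.0 + rng.normal(0, 0.02, 365).cumsum(), index=dates)

latest = creatinine.index[-1]
features = {
    "last_value": creatinine.iloc[-1],
    "mean_7d":    creatinine[creatinine.index > latest - pd.Timedelta("7D")].mean(),
    "mean_90d":   creatinine[creatinine.index > latest - pd.Timedelta("90D")].mean(),
    "slope_1y":   np.polyfit(np.arange(len(creatinine)), creatinine.values, 1)[0],
}
print(features)
```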

“Integrating information from multiscale analysis can give you a better picture of a patient,” Alterovitz said. “It’s very useful for finding relationships – say, inputs from different specialists, such as radiologists and pathologists, who may not normally interact with each other that much.” AI also, he said, can be applied to VA care systemwide, to reveal outcomes in terms of quality, efficiency, and cost – a better picture of how the entire VA health care system is performing.

Wyndy Wiitala, PhD, who co-directs the CCMR’s Prediction Modeling Unit in Ann Arbor with Waljee, believes the NAII, with its focus on coordinating resources and expertise, could help unlock the vast potential of data contained within veterans’ medical records. “The VA has such rich data,” she said. “There are a lot of opportunities to use that data for different prediction models in different situations, to understand patient trajectories and to improve patient care. I think there is a lot of work to be done in that area, and I think it would be great to collaborate with others.”

Given its size, complexity, and emphasis on data-driven clinical decision-making, AI may become the most powerful tool available to VA researchers, clinicians, and administrators: the key to putting the VA’s unique resources to work to help veterans.

____________________________________________________________

 

USING AI TO PREDICT IBD FLARE-UPS

At the VA’s Ann Arbor Healthcare System in Michigan, gastroenterologist Akbar Waljee, MD, is building a better way to predict flare-ups in symptoms associated with inflammatory bowel disease (IBD).

More than a million Americans – including, according to a VA study covering 2000 to 2019, between 60,000 and 80,000 veterans – suffer from IBD, an umbrella term for chronic conditions that include Crohn’s disease and ulcerative colitis. According to the Crohn’s and Colitis Foundation, IBD-related hospitalizations and outpatient drug therapies cost between $11 billion and $28 billion annually.

Periods of symptom flares and remission are typical of IBD, and flare-ups are often painful and debilitating enough to require hospitalization, surgery, or treatment with steroids – which can cause side effects of their own and increase the risk of other disorders, such as infections, bone loss, blood clots, and high blood pressure. In some cases, IBD can lead to life-threatening complications such as blood clots and liver damage.

Inflammatory bowel disease (IBD). Adobe Stock image

Biomarkers that help predict IBD flares are most commonly identified through blood or stool tests, which are expensive and vary widely in availability and accuracy. Better predictive models, Waljee believes, would help patients avoid disabling aggravations of this disease and keep them in remission – and in remission, avoid long periods of ineffective or unnecessary therapies with other drugs. By keeping veterans out of hospitals, a better predictor could also greatly reduce IBD-associated health care costs.

Waljee is exceptionally capable of devising this new predictive tool: In addition to being a staff physician, he’s an investigator in the Ann Arbor VA’s Center for Clinical Management Research (CCMR). With research health scientist Wyndy Wiitala, PhD, he co-directs the CCMR’s Prediction Modeling Unit (PMU), which uses machine learning to collect and analyze patient data for the purpose of informing clinical decisions.


Using machine-learning algorithms to analyze patient record data, including histories of flares and commonly available bloodwork values, the PMU team, along with University of Michigan statistician Ji Zhu, PhD, came up with a set of criteria for deciding which patients to watch more closely and which may need to begin taking non-steroidal medications to forestall flares. Because some of these drugs take two to three months to take effect, Waljee said, it’s important to identify a coming flare at least three months in advance. “We decided to take all of the relevant information from a patient’s prior history, the longitudinal data, and then predict their need for steroids in the next few months,” he said.
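
A hedged sketch of that general setup – per-patient summaries of longitudinal data feeding a classifier that estimates the probability of needing steroids in the coming months – might look like the following; the feature names, synthetic data, and model choice are assumptions, not the published model:

```python
# Hedged sketch: per-patient summaries of longitudinal bloodwork and flare
# history feeding a classifier that predicts near-term steroid need.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "albumin_mean_6mo":      rng.normal(3.8, 0.5, n),
    "hemoglobin_trend_6mo":  rng.normal(0.0, 0.3, n),
    "prior_flare_count":     rng.integers(0, 6, n),
    "days_since_last_flare": rng.integers(30, 1000, n),
})
# Synthetic label: a flare proxy tied to flare history and low albumin.
df["steroids_within_3_months"] = (
    (df["prior_flare_count"] > 2) & (df["albumin_mean_6mo"] < 3.8)
).astype(int)

X = df.drop(columns="steroids_within_3_months")
y = df["steroids_within_3_months"]
model = RandomForestClassifier(n_estimators=200, random_state=0)
print("Cross-validated AUC:",
      round(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean(), 2))
```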

The model produced by the Ann Arbor team – which is updated over time to integrate new patient data – has outperformed traditional tests, predicting flare-ups among veteran patients with about 80 percent accuracy. The next step, Waljee said, will be to validate the model in an external cohort of patients, and then to develop a platform for deploying the model throughout the VA.

 


By Craig Collins

Craig Collins is a veteran freelance writer and a regular Faircount Media Group contributor who...