By Julie Galante

Alvaro Morales MPA 2024 is building a way for doctors to automate basic tasks so they can focus on making deeper connections with their patients.

The summer before starting the Master in Public Administration Program, Alvaro Morales MPA 2024 was home in Bogotá, Colombia, when his father’s legs began to swell. Morales brought him to a doctor, who diagnosed varicose veins, but one of Morales’ sisters, an anesthesiologist with training in cardiology, deduced it was heart failure. They rushed their father to the hospital, where he had open-heart surgery to repair his aortic valve.

“The first doctor gave a bad diagnosis,” he says. “If it weren’t for my sister, our dad would be dead.”  

His family’s traumatic experience inspired Morales to learn about the underlying systemic causes of misdiagnosis and misprescription in the healthcare system and to help address them by drawing on his professional background in health regulation and technology.

Morales dove in from the start of his first semester at Harvard Kennedy School.

He took the late former U.S. Secretary of Defense Ash Carter’s renowned class on innovation and technology—“the gold standard at HKS,” he says—and artificial intelligence (AI) electives at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) to learn how to code and develop AI applications. He began co-chairing the AI and Tech Policy student organization, which focuses on AI and its regulation. And this spring, he’s a teaching assistant for DPI-681M: The Science and Implications of Generative AI, co-taught by Lecturer in Public Policy Teddy Svoronos.

ChatGPT, the chat generative pre-trained transformer developed by OpenAI, was starting to make headlines at the end of Morales’ first semester. One of his AI classes was discussing how to translate ChatGPT-generated research into a practical application for medical professionals when Morales had an idea: what if an application could take a patient’s entire clinical history to predict a diagnosis and suggest the associated prescriptions?

Morales enrolled in a machine learning operations course at SEAS, where he worked with a team of classmates to build a prototype and bring the idea to life. They called the application PrescrAIbe.co. 

“We’ve been developing PrescrAIbe.co this past year to help doctors with prescriptions,” Morales explains. “The application takes the patient’s medical history and uses LLMs [large language models] to predict a diagnosis and suggest the prescription.” 
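
The PrescrAIbe.co codebase isn’t public, but the core idea Morales describes, feeding a patient’s history to a large language model and getting back a suggested diagnosis, its ICD-10 code, and a candidate prescription for the physician to review, can be sketched in a few lines. The sketch below uses the OpenAI Python SDK; the function name, prompt wording, model choice, and sample history are illustrative assumptions, not details of the actual application.

```python
# Illustrative sketch only: PrescrAIbe.co's actual architecture is not public.
# Assumes the OpenAI Python SDK (openai>=1.0) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a clinical decision-support assistant. Given a patient's medical "
    "history, suggest the most likely diagnosis with its ICD-10 code and a "
    "candidate prescription. Your output is a suggestion for a licensed "
    "physician to review, not medical advice."
)


def suggest_diagnosis(patient_history: str, model: str = "gpt-4o") -> str:
    """Return an LLM-generated diagnosis and prescription suggestion (hypothetical helper)."""
    response = client.chat.completions.create(
        model=model,  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Patient history:\n{patient_history}"},
        ],
        temperature=0.2,  # keep suggestions conservative and repeatable
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Hypothetical example loosely echoing the case that motivated the project.
    history = (
        "Male patient, progressive swelling in both legs over two weeks, "
        "shortness of breath on exertion."
    )
    print(suggest_diagnosis(history))
```

In a real clinical workflow, the model’s output would be surfaced alongside the chart for the doctor to confirm, refine, or reject, keeping the physician, not the model, responsible for the final call.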

Currently, primary care physicians spend up to 50 percent of their workday on administrative tasks such as searching for International Classification of Diseases (ICD-10) codes. And critically, up to 50 percent of antibiotics are inappropriately prescribed, fueling antibiotic resistance. Misprescription, Morales notes, affects health systems and drives up their expenditures.

By automating such menial tasks, doctors could spend more time with their patients collecting nuanced information, weighing the AI-generated suggestions, and ultimately making a diagnosis.

“I see AI as a tool to augment human capacity, not to replace it,” Morales says. “By automating tedious tasks, humans can focus on developing deeper interaction and connection—it’s our great advantage over artificial intelligence.” 

Morales and his team are testing and fine-tuning the PrescrAIbe.co model. It’s also under review at the Center for Medicines, Information, and Power, a think tank at the National University of Colombia (Universidad Nacional de Colombia) where Morales used to work. He is also in conversation with health insurers in Colombia about PrescrAIbe.co—two are interested in running pilots.  

“Healthcare is a very risky sector,” he stresses. “We have to be very sure we have a well-functioning tool that is ethically responsible and useful, and that it works.” 

Morales is well acquainted with the inner workings of the healthcare sector.  

Before coming to HKS, he worked for the Inter-American Development Bank (IDB) on health regulation, focusing on pharmaceutical price regulation at the Ministry of Health and Social Protection in Colombia.  

Morales also worked as an IDB economic consultant in the Dominican Republic, helping to build the country’s pharmaceutical policy, before returning to Colombia to pivot to technology, a field that has interested him since childhood. There he served as chief economist at the Colombian Chamber of Electronic Commerce (Cámara Colombiana de Comercio Electrónico, CCE), working on AI regulation and AI ethics, and collaborated with the Center for Medicines, Information, and Power at Universidad Nacional de Colombia, where he came across a study of elderly patients taking more than five medications that focused on how removing some of those medications could improve patient outcomes.

“While I was working in government, I saw pharmaceutical companies have a huge incentive to promote medications that aren’t in the best interest of the health system,” Morales explains. “They would promote new, very expensive technologies, but patients may have the same outcome using older medicines that do not have IP [intellectual property] protection.” 

Takeaways from his HKS experience

As an HKS student, Morales has been leveraging the Harvard Innovation Labs (iLab) to develop a product mindset and think about PrescrAIbe.co’s business structure. As an example, he cited a recent iLab workshop that focused on understanding basic terminology in AI-based healthcare technology, electronic records, and regulations.

“The iLab has also been really useful for learning how to reach clients and other skills I was not very well acquainted with, like how to have a product mindset and think about the business structure,” he says.

Morales says his overall HKS experience informed his understanding of AI in two ways.

“PrescrAIbe.co is a social impact project,” he explains. “When I worked in the government and in healthcare policy, I realized there are limits to what governments can do. There is a huge opportunity to use AI tools outside formal policy to create change. The pharmaceutical industry’s behaviors can have a negative effect on the healthcare system in particular. I think AI tools can help to promote financial sustainability for health systems and, at the same time, support how doctors are doing their work.” 

Morales also continues to learn about new developments in AI policy while at HKS—something he and his classmates discuss in the AI and Tech Policy student organization. Recently, he’s been studying the European Union’s risk-based approach to regulating AI, under which healthcare is treated as one of the highest-risk areas.

“This has incentivized me to stop and really think about the impact of PrescrAIbe.co, how it can affect patients and doctors, and to slow things down to avoid the ‘move fast and break things’ mindset. That mindset doesn’t apply to the healthcare industry, where you have to be very mindful of what you are doing and the impact you’re going to have,” he says. “My HKS experience helped me to learn about these ideas, about regulation, and how AI should be done properly.”

“The discussion about how to regulate AI technology will only grow at HKS,” he adds. 
