Protecting maternal health in Rwanda | MIT News
The world is facing a maternal health crisis. According to the World Health Organization, approximately 810 women die each day from preventable causes related to pregnancy and childbirth. Two-thirds of these deaths occur in sub-Saharan Africa. In Rwanda, one of the leading causes of maternal mortality is infected caesarean wounds.
A multidisciplinary team of physicians and researchers from MIT, Harvard University, and Partners in Health (PIH) in Rwanda has proposed a solution to this problem. The team has developed a mobile health (mHealth) platform that uses artificial intelligence and real-time computer vision to predict infection in caesarean wounds with roughly 90 percent accuracy.
“Early detection of infection is an important issue worldwide, but in low-resource areas such as rural Rwanda the problem is even more dire, given the shortage of trained doctors and the high prevalence of antibiotic-resistant bacterial infections,” says Richard Ribon Fletcher ’89, SM ’97, PhD ’02, a research scientist in mechanical engineering at MIT and technology lead for the team. “Our idea was to employ mobile phones that could be used by community health workers to visit new mothers in their homes and inspect their wounds to detect infection.”
This summer, the team, led by Harvard Medical School Professor Bethany Hedt-Gauthier, won the $500,000 first prize in the NIH Technology Accelerator Challenge for Maternal Health.
“The lives of women who deliver by caesarean section in the developing world are at risk due to limited access to quality surgery and postnatal care,” adds Fredrick Kateera, a team member from PIH. “Use of mobile health technology for early identification and plausibly accurate diagnosis of surgical site infections within these communities would be a scalable game-changer in optimizing women’s health.”
Training an algorithm to detect infection
The project began with a series of chance encounters. In 2017, Fletcher and Hedt-Gauthier bumped into each other on the Washington, D.C. Metro during an NIH investigator meeting. Hedt-Gauthier, who by then had been working on research projects in Rwanda for five years, was looking for a solution to the gap in caesarean care that she and her collaborators had encountered in their research. Specifically, she was interested in using cell phone cameras as a diagnostic tool.
Fletcher, who leads a group of students in Professor Sanjay Sarma’s AutoID Lab and has spent decades applying phones, machine learning algorithms, and other mobile technologies to global health, was a natural fit for the project.
“When we realized that these types of image-based algorithms could support home-based care for women after caesarean delivery, we approached Dr. Fletcher as a collaborator,” says Hedt-Gauthier.
During that same trip, Hedt-Gauthier happened to be seated next to Audace Nakeshimana ’20, an MIT freshman from Rwanda who would later join Fletcher’s team at MIT. With Fletcher’s mentorship, Nakeshimana founded Insightiv during his senior year, a Rwandan startup that applies AI algorithms to the analysis of clinical images, and was a top grant awardee at the annual MIT IDEAS competition in 2020.
The first step in the project was to gather a database of wound images taken by community health workers in rural Rwanda. The team collected more than 1,000 images of both infected and uninfected wounds and used that data to train an algorithm.
A central problem emerged with this first dataset, collected between 2018 and 2019: many of the photos were of poor quality.
“The quality of the wound images collected by the health workers was highly variable, and it required a large amount of manual labor to crop and resample the images. Since these images are used to train the machine learning model, image quality and variability fundamentally limit the performance of the algorithm,” says Fletcher.
To solve this problem, Fletcher turned to tools he had used in previous projects: real-time computer vision and augmented reality.
Improving image quality with real-time image processing
To help community health workers capture better-quality images, Fletcher and his team revised the wound-screener mobile app and paired it with a simple paper frame. The frame contains a printed calibration color pattern and another optical pattern that guides the app’s computer vision software.
Health workers are instructed to place the frame around the wound and open the app, which gives real-time feedback on the camera’s placement. Augmented reality is used to display a green check mark when the phone is within the proper range. Once in range, other components of the computer vision software automatically balance the colors, crop the image, and apply transformations to correct for parallax.
“By using real-time computer vision at the time of data collection, we are able to generate beautiful, clean, uniform, color-balanced images that can then be used to train our machine learning models, without any need for manual data cleaning or post-processing,” says Fletcher.
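For readers curious what this kind of capture pipeline can look like, the minimal sketch below uses Python and OpenCV to illustrate the three steps Fletcher describes: checking that the paper frame fills enough of the view, correcting for parallax, and balancing colors against the printed calibration pattern. The reference colors, coverage threshold, and output size here are illustrative assumptions, not details of the team’s actual app.

```python
# Minimal sketch of a guided-capture pipeline, assuming an OpenCV workflow.
# The reference patch colors, coverage threshold, and output size are
# hypothetical choices for illustration only.
import cv2
import numpy as np

# Hypothetical printed calibration patches (white, gray, red) in BGR order.
REF_BGR = np.array([[255, 255, 255], [128, 128, 128], [0, 0, 255]], dtype=np.float32)

def placement_ok(corners_px, frame_shape, min_fraction=0.25):
    """Return True (show the green check mark) when the detected paper frame
    fills enough of the camera's field of view to yield a usable image."""
    area = cv2.contourArea(np.float32(corners_px))
    return area / (frame_shape[0] * frame_shape[1]) >= min_fraction

def correct_perspective(frame_bgr, corners_px, out_size=512):
    """Warp the quadrilateral bounded by the frame's corners (ordered
    top-left, top-right, bottom-right, bottom-left) to a square, which
    compensates for camera tilt (parallax)."""
    dst = np.array([[0, 0], [out_size - 1, 0],
                    [out_size - 1, out_size - 1], [0, out_size - 1]], dtype=np.float32)
    H = cv2.getPerspectiveTransform(np.float32(corners_px), dst)
    return cv2.warpPerspective(frame_bgr, H, (out_size, out_size))

def balance_colors(img_bgr, measured_bgr):
    """Scale each color channel so the patches measured in the photo match
    the printed reference colors (a simple per-channel color balance)."""
    gain = REF_BGR.mean(axis=0) / (np.float32(measured_bgr).mean(axis=0) + 1e-6)
    return np.clip(img_bgr.astype(np.float32) * gain, 0, 255).astype(np.uint8)
```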
Using a convolutional neural network (CNN) machine learning model, along with a technique called transfer learning, the software was able to predict infection in caesarean wounds with roughly 90 percent accuracy within 10 days of childbirth. Women who are predicted to have an infection through the app are then referred to a clinic, where they can receive a diagnostic bacterial test and be prescribed life-saving antibiotics as needed.
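As a rough illustration of what transfer learning means in this context, the sketch below builds a binary wound classifier on top of a CNN pre-trained on ImageNet, using Keras. The choice of base network, input size, and training settings are assumptions made for the example; they are not the team’s published architecture.

```python
# Minimal transfer-learning sketch, assuming a Keras/TensorFlow workflow.
# MobileNetV2, the input size, and the optimizer settings are illustrative
# assumptions, not the team's actual model.
import tensorflow as tf

def build_wound_classifier(image_size=(224, 224)):
    # Reuse visual features from a CNN pre-trained on ImageNet.
    base = tf.keras.applications.MobileNetV2(
        input_shape=image_size + (3,), include_top=False, weights="imagenet")
    base.trainable = False  # freeze pre-trained layers for the first training phase

    inputs = tf.keras.Input(shape=image_size + (3,))
    x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
    x = base(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.2)(x)
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # P(infected)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model

# Hypothetical usage with labeled wound-image datasets train_ds and val_ds:
# model = build_wound_classifier()
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```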
The app has been well received by women and community health workers in Rwanda, according to Anne Niyigena of PIH.
Addressing Algorithmic Bias Using Thermal Imaging
One of the biggest hurdles to scaling this AI-based technology to a more global audience is algorithmic bias. When trained on a relatively homogeneous population, such as in rural Rwanda, the algorithm works as expected and can correctly predict infections. However, the algorithm becomes less effective when images of patients with different skin tones are introduced.
To tackle this problem, Fletcher turned to thermal imaging. Simple thermal camera modules, designed to attach to a cell phone, cost around $200 and can be used to capture infrared images of wounds. The heat patterns in these images can then be used to train an algorithm to predict infection. A study published last year showed over 90 percent prediction accuracy when these thermal images were paired with the app’s CNN algorithm.
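One simple way such thermal data could be folded into a CNN pipeline, assuming the camera module provides a per-pixel temperature map, is to normalize the temperatures and stack them with the color image as an extra input channel. The sketch below illustrates that idea; the temperature range and fusion scheme are assumptions for the example, not the team’s published method.

```python
# Minimal sketch of combining a thermal temperature map with a color wound
# image. The skin-temperature range and channel-stacking approach are
# illustrative assumptions only.
import numpy as np
import tensorflow as tf

def to_thermal_channel(temps_c, t_min=28.0, t_max=40.0, size=(224, 224)):
    """Normalize a per-pixel temperature map (degrees C) to [0, 1] and resize
    it so it can be stacked with the RGB wound image."""
    norm = np.clip((temps_c - t_min) / (t_max - t_min), 0.0, 1.0)
    return tf.image.resize(norm[..., None], size).numpy()

def fuse_rgb_thermal(rgb_01, thermal_01):
    """Stack color (H x W x 3) and thermal (H x W x 1) channels into a single
    H x W x 4 array that a CNN with a 4-channel input layer could train on."""
    return np.concatenate([rgb_01, thermal_01], axis=-1)
```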
While more expensive than simply using a phone camera, the thermal imaging approach could be used to extend the team’s mHealth technology to a more diverse global population.
“We are giving the health staff two options,” Fletcher explains. “In a homogeneous population, like rural Rwanda, they can use a standard phone camera with a model trained on data from the local population. Otherwise, they can use the more general model, which requires the thermal camera attachment.”
While the current generation of the mobile app uses a cloud-based algorithm to run its infection-prediction model, the team is now working on a standalone mobile app that does not require internet access and that addresses all aspects of maternal health, from pregnancy through the postpartum period.
In addition to developing the library of wound images used by the algorithm, Fletcher is working closely with former student Nakeshimana and his team at Insightiv on the app’s development, using Android phones that are manufactured locally in Rwanda. PIH will then conduct user testing and field-based validation in Rwanda.
Privacy and data protection are top priorities as the team aims to develop a comprehensive app for maternal health.
Close attention must be paid to patient data privacy as these tools are developed and refined, and additional data security measures will need to be incorporated.
Members of the winning team include Richard Fletcher of MIT; Robert Riviello of Brigham and Women’s Hospital; Adeline Boatin of Massachusetts General Hospital; Anne Niyigena, Fredrick Kateera, Laban Bikorimana, and Vincent Cubaka of PIH in Rwanda; and Audace Nakeshimana ’20, founder of Insightiv.ai.