Medical Imaging Deep Learning

Memory-Driven Transformer for Radiology

Led development of a system that generates clinically consistent radiology reports using a transformer architecture with relational memory and PubMed BERT integration.

April 1, 2025
ai/ml · deep-learning · python

01 · Context

Medical imaging generates massive amounts of data that radiologists must interpret and document. This project aimed to automate the generation of clinically consistent radiology reports from medical images using state-of-the-art deep learning.

02 · What I Built

A transformer-based report generator with a relational memory module that maintains clinical context across image sequences. A pre-trained PubMed BERT model supplies domain-specific language understanding for identifying abnormalities and producing medically accurate descriptions. Training ran on AWS SageMaker's distributed training infrastructure.
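
The memory module follows the memory-driven transformer idea: a small matrix of memory slots is carried across decoding steps and updated with attention and gating. The sketch below is a minimal, illustrative PyTorch version; the slot count, dimensions, single attention layer, and gating details are assumptions rather than the exact module used in this project.

```python
# Minimal sketch of a relational memory cell (illustrative, not the project's exact module).
import torch
import torch.nn as nn


class RelationalMemory(nn.Module):
    """Keeps a small matrix of memory slots and updates it each decoding step
    by attending over the previous memory concatenated with the current input."""

    def __init__(self, num_slots: int = 3, d_model: int = 512, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, d_model)
        )
        # Gates decide how much of the attended update is written into memory.
        self.w_gate = nn.Linear(d_model, 2 * d_model)
        self.u_gate = nn.Linear(d_model, 2 * d_model)
        self.register_buffer("init_memory", torch.randn(num_slots, d_model))

    def initial_state(self, batch_size: int) -> torch.Tensor:
        # (B, num_slots, d_model) starting memory, shared across the batch.
        return self.init_memory.unsqueeze(0).expand(batch_size, -1, -1)

    def forward(self, memory: torch.Tensor, token: torch.Tensor) -> torch.Tensor:
        # memory: (B, num_slots, d_model); token: (B, d_model) for one step.
        token = token.unsqueeze(1)                      # (B, 1, d_model)
        kv = torch.cat([memory, token], dim=1)          # attend over memory + input
        attended, _ = self.attn(memory, kv, kv)
        candidate = self.mlp(attended) + attended + memory
        # Input/forget gates conditioned on the current token and old memory.
        input_gate, forget_gate = (self.w_gate(token) + self.u_gate(memory)).chunk(2, dim=-1)
        return torch.sigmoid(forget_gate) * memory + \
               torch.sigmoid(input_gate) * torch.tanh(candidate)


# Example: roll the memory forward over a sequence of decoder hidden states.
cell = RelationalMemory()
hidden = torch.randn(2, 10, 512)                        # (batch, steps, d_model), dummy data
mem = cell.initial_state(batch_size=2)
for t in range(hidden.size(1)):
    mem = cell(mem, hidden[:, t])
```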

03 · Key Decisions

1. Introduced a relational memory module to maintain clinical context across image sequences
2. Integrated pre-trained PubMed BERT for domain-specific language understanding
3. Leveraged AWS SageMaker for distributed training on large datasets (see the launch sketch after this list)
4. Designed the pipeline to scale to larger datasets of images and paired reports
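
Distributed training jobs on SageMaker are typically driven from the SageMaker Python SDK. The snippet below is a hedged sketch of how such a launch might look: the bucket paths, instance types, entry-point script, channel names, and hyperparameter names are placeholders, the distribution setting is one of SageMaker's standard data-parallel launchers, and the PubMedBERT checkpoint ID is the public Hugging Face name, which may differ from the exact weights used in this project.

```python
# Hedged sketch of a SageMaker distributed training launch (placeholders throughout).
import sagemaker
from sagemaker.pytorch import PyTorch

session = sagemaker.Session()
role = sagemaker.get_execution_role()      # assumes this runs inside SageMaker

estimator = PyTorch(
    entry_point="train.py",                # hypothetical training script for the report generator
    source_dir="src",
    role=role,
    framework_version="2.1",
    py_version="py310",
    instance_count=2,                      # data-parallel across nodes
    instance_type="ml.p3.8xlarge",
    distribution={"torch_distributed": {"enabled": True}},  # torchrun-based launcher
    hyperparameters={
        "epochs": 30,
        "text-encoder": "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext",
        "memory-slots": 3,
    },
    sagemaker_session=session,
)

# Image/report pairs staged in S3; bucket and channel names are illustrative.
estimator.fit({
    "train": "s3://my-bucket/radiology/train",
    "val": "s3://my-bucket/radiology/val",
})
```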

04 · Challenges

Ensuring clinical accuracy in generated reports
Handling large-scale medical imaging datasets
Balancing model complexity with inference speed

05 · Outcomes

Outperformed baseline report-generation models in evaluation
Scaled successfully to a larger dataset of images and paired reports
Demonstrated a practical application of transformer models in the medical imaging domain

06 · Tech Stack

Python · PyTorch · AWS SageMaker · BERT · Transformers