As part of my team's submission for the IBM/Covalent challenge of the 2023 MIT iQuHack hackathon, we implemented a Hybrid Classical-to-Quantum Neural Network designed to detect dementia severity from brain MRI scans.
Motivations:
Since the motivation behind this project was dictated by the challenge prompt, I will instead talk about why we chose this specific network architecture for this use case.
The hybrid network structure was chosen to combine the strengths of classical and quantum computing. By using a pretrained classical Convolutional Neural Network (CNN) to extract rich feature vectors from the MRI images, we were able to take advantage of "transfer learning", which helped us get a working prototype running quickly during the hackathon. This approach removes the need to train the quantum network directly on raw image data, since the classical CNN already provides generalized, high-level representations of the input.
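To make the transfer-learning step concrete, here is a minimal sketch of how a frozen, pretrained CNN can act as the feature extractor in front of the quantum part of the network. The choice of ResNet-18, the 224x224 input size, and the four-qubit projection are assumptions for illustration, not necessarily the exact configuration we used:

```python
import torch
from torch import nn
from torchvision import models

N_QUBITS = 4  # assumed size of the quantum layer's input

# Pretrained CNN used purely as a feature extractor (transfer learning):
# drop the ImageNet classifier head and freeze the convolutional weights.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Identity()
for param in backbone.parameters():
    param.requires_grad = False

# Small trainable projection from the 512-d ResNet features down to one value
# per qubit; tanh keeps the outputs in a range suitable for angle encoding.
projection = nn.Sequential(nn.Linear(512, N_QUBITS), nn.Tanh())

# Example: one 3-channel MRI slice resized to 224x224
dummy_mri = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    features = backbone(dummy_mri)   # shape (1, 512)
angles = projection(features)        # shape (1, N_QUBITS), fed to the quantum layer
```

With the backbone frozen, only the small projection (and the quantum layers behind it) need to be trained, which is what made rapid prototyping feasible during the hackathon.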
While quantum neural networks are not yet practical on today's hardware, in theory quantum parallelism could allow such models to be trained more efficiently than comparable classical models. Once the hardware catches up, this would let our model be retrained frequently and stay up to date as its MRI dataset grows.
Implementation:
Our original plan for this project was to connect the model to a React Native frontend, making the service easily accessible to users on their mobile devices. Since then, we have revisited and polished the project, and the model is now available as a Gradio demo on HuggingFace where anyone can try it.
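A Gradio demo of this kind boils down to a simple Interface wrapped around the model's prediction function. The sketch below is a stand-in rather than the code running on HuggingFace: the label names are assumed from the Kaggle dataset's four severity classes, and the placeholder prediction function returns uniform scores instead of calling the trained model.

```python
import gradio as gr

# Assumed names for the four severity classes; the real demo's labels may differ.
SEVERITY_LABELS = ["Non-demented", "Very mild", "Mild", "Moderate"]

def predict_severity(mri_image):
    # Placeholder inference: the actual demo runs the trained hybrid model here
    # and returns per-class confidences. Uniform scores keep this sketch runnable.
    uniform = 1.0 / len(SEVERITY_LABELS)
    return {label: uniform for label in SEVERITY_LABELS}

demo = gr.Interface(
    fn=predict_severity,
    inputs=gr.Image(type="pil", label="Brain MRI slice"),
    outputs=gr.Label(num_top_classes=len(SEVERITY_LABELS)),
    title="Dementia Severity from MRI",
)

if __name__ == "__main__":
    demo.launch()
```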
The model is implemented in Python using PyTorch and PennyLane, and was trained on a preprocessed version of the MRI and Alzheimer's dataset from Kaggle.
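As a rough sketch of how the PyTorch and PennyLane pieces fit together, a variational circuit can be wrapped as a qml.qnn.TorchLayer so it stacks with ordinary torch layers. The circuit templates (angle embedding plus strongly entangling layers), the qubit count, and the four output classes below are illustrative assumptions rather than our exact setup:

```python
import pennylane as qml
import torch
from torch import nn

N_QUBITS = 4    # assumed number of qubits
N_LAYERS = 3    # assumed depth of the variational circuit
N_CLASSES = 4   # assumed number of dementia-severity classes

dev = qml.device("default.qubit", wires=N_QUBITS)

@qml.qnode(dev, interface="torch")
def circuit(inputs, weights):
    # Encode the CNN-derived feature vector as single-qubit rotation angles
    qml.AngleEmbedding(inputs, wires=range(N_QUBITS))
    # Trainable entangling layers form the variational part of the model
    qml.StronglyEntanglingLayers(weights, wires=range(N_QUBITS))
    # Return one expectation value per qubit to the classical side
    return [qml.expval(qml.PauliZ(w)) for w in range(N_QUBITS)]

# Wrap the circuit so PyTorch treats it like any other nn.Module
weight_shapes = {"weights": (N_LAYERS, N_QUBITS, 3)}
quantum_layer = qml.qnn.TorchLayer(circuit, weight_shapes)

# Quantum head: run the circuit on the encoded features, then map its
# outputs to dementia-severity class logits.
quantum_head = nn.Sequential(quantum_layer, nn.Linear(N_QUBITS, N_CLASSES))

# Example forward pass on a dummy batch of per-qubit angles (e.g. the output
# of the classical projection sketched earlier)
dummy_angles = torch.rand(1, N_QUBITS)
logits = quantum_head(dummy_angles)   # shape (1, N_CLASSES)
```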
Future Work:
- Train on a larger dataset
- Investigate a better formulation for inferring dementia severity (the severity levels are ordered, so plain multi-class classification may not be the best fit)