Interpretable cancer cell detection with phonon microscopy using multi-task conditional neural networks for inter-batch calibration

Advances in artificial intelligence (AI) show great potential in revealing underlying information from phonon microscopy (high-frequency ultrasound) data to identify cancerous cells. However, this technology suffers from the 'batch effect' that comes from unavoidable technical variations between each experiment, creating confounding variables that the AI model may inadvertently learn. We therefore present a multi-task conditional neural network framework to simultaneously achieve inter-batch calibration, by removing confounding variables, and accurate cell classification of time-resolved phonon-derived signals. We validate our approach by training and validating on different experimental batches, achieving a balanced precision of 89.22% and an average cross-validated precision of 89.07% for classifying background, healthy and cancerous regions. Classification can be performed in 0.5 seconds, with only simple prior batch information required for multiple batch corrections. Further, we extend our model to reconstruct denoised signals, enabling physical interpretation of salient features indicating disease state, including sound velocity, sound attenuation and cell adhesion to the substrate.
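The abstract describes a shared network conditioned on a batch label, with one head classifying the cell region and another reconstructing the denoised signal. The sketch below is purely illustrative and is not the authors' architecture: all dimensions, weight initialisations, and the one-hot batch-conditioning scheme are assumptions chosen to show the general multi-task conditional pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Hypothetical dimensions: a 300-sample phonon time trace, 4 experimental
# batches, 3 output classes (background / healthy / cancerous).
SIG_LEN, N_BATCHES, HIDDEN, N_CLASSES = 300, 4, 64, 3

# Shared encoder weights (randomly initialised; a real model would be trained).
W_enc = rng.normal(0.0, 0.05, (SIG_LEN + N_BATCHES, HIDDEN))
# Two task heads: region classification and denoised-signal reconstruction.
W_cls = rng.normal(0.0, 0.05, (HIDDEN, N_CLASSES))
W_rec = rng.normal(0.0, 0.05, (HIDDEN, SIG_LEN))

def forward(signal, batch_id):
    """Condition the shared encoder on a one-hot batch label so that
    batch-specific variation can be modelled separately from biology."""
    cond = np.zeros(N_BATCHES)
    cond[batch_id] = 1.0
    h = relu(np.concatenate([signal, cond]) @ W_enc)  # shared representation
    logits = h @ W_cls                                # classification head
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                              # softmax over 3 classes
    recon = h @ W_rec                                 # reconstruction head
    return probs, recon

probs, recon = forward(rng.normal(size=SIG_LEN), batch_id=2)
```

Feeding the same trace with a different `batch_id` changes the shared representation, which is the mechanism by which a conditional network of this kind can absorb inter-batch variation during training.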

Media type:

Preprint

Year of publication:

2024

Published:

2024

Contained in:

arXiv.org - (2024), dated: 26 March. To the complete record - year:2024

Language:

English

Contributors:

Zheng, Yijie [Author]
Fuentes-Dominguez, Rafael [Author]
Clark, Matt [Author]
Gordon, George S. D. [Author]
Perez-Cota, Fernando [Author]

Links:

Full text [free of charge]

Subjects:

000
570
620
Computer Science - Artificial Intelligence
Computer Science - Machine Learning
Electrical Engineering and Systems Science - Image and Video Processing
Electrical Engineering and Systems Science - Signal Processing
Quantitative Biology - Quantitative Methods

Funding institution / project title:

PPN (catalog ID):

XAR043061753