Riboformer: A Deep Learning Framework for Predicting Context-Dependent Translation Dynamics

Abstract: Translation elongation is essential for maintaining cellular proteostasis, and alterations in the translational landscape are associated with a range of diseases. Ribosome profiling allows detailed measurement of translation at genome scale. However, it remains unclear how to disentangle biological variation from technical artifacts and identify the sequence determinants of translation dysregulation. Here we present Riboformer, a deep learning-based framework for modeling context-dependent changes in translation dynamics. Riboformer leverages the transformer architecture to accurately predict ribosome densities at codon resolution. It corrects experimental artifacts in previously unseen datasets, reveals subtle differences in synonymous codon translation, and uncovers a bottleneck in protein synthesis. Further, we show that Riboformer can be combined with in silico mutagenesis analysis to identify sequence motifs that contribute to ribosome stalling across various biological contexts, including aging and viral infection. Our tool offers a context-aware and interpretable approach for standardizing ribosome profiling datasets and elucidating the regulatory basis of translation kinetics.
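To make the modeling approach described in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' released code) of its two key ideas: a transformer encoder that predicts per-codon ribosome densities, and an in silico mutagenesis scan over codon positions. It assumes PyTorch; all names (CodonDensityTransformer, in_silico_mutagenesis) and hyperparameters are illustrative assumptions, and the pairing with a reference dataset that Riboformer uses to model context-dependent changes is omitted here.

# Hypothetical sketch, not the authors' code: a transformer encoder mapping a
# codon sequence to per-codon ribosome densities, plus a simple in silico
# mutagenesis scan. Architecture and hyperparameters are illustrative only.
import itertools
import torch
import torch.nn as nn

CODONS = ["".join(c) for c in itertools.product("ACGU", repeat=3)]  # 64 codons
CODON_TO_ID = {c: i for i, c in enumerate(CODONS)}


class CodonDensityTransformer(nn.Module):
    """Predict one ribosome-density value per codon position."""

    def __init__(self, d_model=128, n_heads=4, n_layers=4, max_len=512):
        super().__init__()
        self.codon_emb = nn.Embedding(len(CODONS), d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True,
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)  # codon-resolution regression head

    def forward(self, codon_ids):
        # codon_ids: (batch, seq_len) integer codon indices
        pos = torch.arange(codon_ids.size(1), device=codon_ids.device)
        x = self.codon_emb(codon_ids) + self.pos_emb(pos)[None, :, :]
        x = self.encoder(x)
        return self.head(x).squeeze(-1)  # (batch, seq_len) predicted densities


def encode(codon_seq):
    """Turn a list of codon strings into a (1, L) tensor of codon indices."""
    return torch.tensor([[CODON_TO_ID[c] for c in codon_seq]])


@torch.no_grad()
def in_silico_mutagenesis(model, codon_seq, position):
    """Score how substituting each codon at `position` shifts the predicted
    density at that position relative to the wild-type sequence."""
    model.eval()
    wild_type = model(encode(codon_seq))[0, position].item()
    effects = {}
    for codon in CODONS:
        mutant = list(codon_seq)
        mutant[position] = codon
        effects[codon] = model(encode(mutant))[0, position].item() - wild_type
    return effects


if __name__ == "__main__":
    model = CodonDensityTransformer()
    seq = ["AUG", "CCU", "GAA", "CGC", "AAA", "UAA"]  # toy ORF
    print(model(encode(seq)))                    # per-codon density predictions
    print(in_silico_mutagenesis(model, seq, 3))  # codon-level effect scores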

Media type:

Preprint

Year of publication:

2023

Published:

2023

Contained in:

bioRxiv.org (2023), dated 02 May 2023

Language:

English

Contributors:

Shao, Bin [Author]
Yan, Jiawei [Author]
Zhang, Jing [Author]
Buskirk, Allen R. [Author]

Links:

Full text [free of charge]

Subjects:

570
Biology

DOI:

10.1101/2023.04.24.538053

Funding institution / project title:

PPN (catalog ID):

XBI039396983