This issue highlights advances in applications of machine learning for diagnosing disease and for sorting and classifying health data. It includes a framework for interpreting multiplexed imaging data to delineate tumour heterogeneity, the counterfactual learning of spatial proteomics data to model perturbations that boost T-cell infiltration into tumours, a multimodal machine-learning model for stratifying breast cancer risk, graph representation learning for identifying cancer genes, an unsupervised deep-learning framework for analysing cancer transcriptomes, cascaded diffusion models as a solution to data scarcity in machine learning, and model auditing guided by expert insights from dermatologists.
The cover illustrates that combining the expertise of physicians to identify medically relevant features in dermatology images with generative machine learning enables auditing of medical-image classifiers.
A modular model integrating clinical metadata, mammography and trimodal ultrasound images from patients presenting to the clinic with breast cancer symptoms performs similarly to, or better than, experienced human experts at differential diagnosis and tumour classification.
The inference process of medical-image classifiers can be audited by leveraging the expertise of physicians to identify medically meaningful features in ‘counterfactual’ images produced via generative AI.
A deep-learning model can accurately quantify circulating tumour DNA from the density distribution of cell-free DNA fragment lengths in plasma from patients with cancer and from healthy individuals.
Cascaded diffusion models can be used to synthesize realistic whole-slide image tiles from latent representations of RNA-sequencing data from human tumours.
A multimodal model for the stratification of breast cancer risk based on clinical metadata, mammography and trimodal ultrasound images performed as well as or better than radiologists at tumour classification and at differential diagnosis.
An interpretable transformer-based model leveraging graph representation learning accurately predicts cancer genes across homogeneous and heterogeneous pan-cancer networks of biological interactions.
Minimal tumour perturbations that boost T-cell infiltration can be discovered by using deep learning to analyse large-scale spatial omics profiles of tumours.
Self-supervised representation learning on data from imaging mass cytometry can be used to distinguish morphological differences in tumour microenvironments and to precisely characterize distinct microenvironment signatures.