Federated Learning (Google Blog) — your device downloads the current model, improves it by training on data stored locally on the phone, and summarizes the changes as a small, focused update. Only this update is sent to the cloud over an encrypted channel, where it is immediately averaged with updates from other users to improve the shared model. All training data stays on the device, and no individual update is stored in the cloud. The Federated Averaging and Secure Aggregation papers underpin the approach.
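The round-trip described above (local training, then a weighted server-side average of client deltas) can be sketched as follows. This is a minimal toy, not Google's implementation: it assumes a plain linear least-squares model, gradient-descent local updates, and plaintext deltas (no secure aggregation), with all function names invented for illustration.

```python
import numpy as np

def local_update(global_weights, X, y, lr=0.1, epochs=5):
    # On-device step: a few gradient iterations on the local data only.
    w = global_weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w - global_weights  # only this small delta leaves the device

def federated_averaging(global_weights, deltas, sizes):
    # Server step: average client deltas, weighted by local dataset size
    # (the core of the Federated Averaging algorithm).
    total = sum(sizes)
    avg_delta = sum(n / total * d for n, d in zip(sizes, deltas))
    return global_weights + avg_delta

# Toy demo: three clients whose private data all follow y = 3x,
# so the shared model's single weight should converge toward 3.
rng = np.random.default_rng(0)
w = np.zeros(1)
for _ in range(20):  # communication rounds
    deltas, sizes = [], []
    for _ in range(3):  # each client trains without sharing raw data
        X = rng.normal(size=(32, 1))
        y = 3.0 * X[:, 0]
        deltas.append(local_update(w, X, y))
        sizes.append(len(y))
    w = federated_averaging(w, deltas, sizes)
print(float(w[0]))  # close to 3.0 after 20 rounds
```

In the real system the server never sees individual deltas in the clear: Secure Aggregation lets it compute only the sum of masked updates.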
A Look at Deep Learning for Science — Galaxy shape modeling with probabilistic auto-encoders, finding extreme weather events in climate simulations, learning patterns in cosmology mass maps, decoding speech from human neural recordings, clustering Daya Bay data with denoising autoencoders, and classifying new physics events at the Large Hadron Collider.