Federated Learning: Training Models Without Sharing Raw Data

As machine learning systems demand ever-larger training datasets, traditional centralized training pipelines strain under privacy requirements, operational inefficiencies, and growing consumer skepticism. Sensitive information, such as medical records or payment histories, cannot easily be pooled in one place because of ethical and legal restrictions.

Federated learning (FL) offers a different answer. Rather than moving the data to the model, it moves the model to the data: institutions and devices train the model locally on their own data and send back only the learned updates (such as weight or gradient changes), never the raw data.
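The round-trip described above (local training on private data, then aggregation of updates on a server) can be sketched with a minimal federated averaging loop. This is an illustrative toy, not any particular framework's API: the clients, their synthetic datasets, and the `local_update` / `fed_avg` helpers are all hypothetical names invented for this example.

```python
# Minimal federated-averaging (FedAvg-style) sketch with NumPy.
# Each "client" holds its own private data; only model weights leave the client.
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local training: gradient steps on its own data only."""
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w = w - lr * grad
    return w

def fed_avg(client_weights, client_sizes):
    """Server-side aggregation: average updates weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(n / total * w for w, n in zip(client_weights, client_sizes))

# Synthetic "private" datasets for three clients (true model: y = 2*x0 - x1).
true_w = np.array([2.0, -1.0])
clients = []
for n in (30, 50, 20):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)  # global model, shared with clients each round
for _ in range(20):  # communication rounds
    updates = [local_update(w, X, y) for X, y in clients]
    w = fed_avg(updates, [len(y) for _, y in clients])

print(np.round(w, 2))  # global model approaches the true weights
```

Note that the server never sees `X` or `y`, only the returned weight vectors; in practice, production systems layer secure aggregation and differential privacy on top of this basic protocol.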

This article has been indexed from DZone Security Zone
