Tournament-Based Pretraining to Accelerate Federated Learning
Advances in hardware, the proliferation of compute at the edge, and data creation at unprecedented scales have made federated learning (FL) necessary for the next leap forward in pervasive machine learning. For privacy and network reasons, large volumes of data remain stranded on endpoints in geographically austere (or at least network-austere) locations, and challenges remain to the effective use of these data. To address the system- and functional-level challenges, we present three novel variants of a serverless federated learning framework. We also present tournament-based pretraining, which we demonstrate significantly improves model performance in some experiments. Together, these extensions to FL and our novel training method enable greater focus on science rather than on ML development.
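The abstract does not spell out the tournament mechanism, so the following is a minimal sketch of one plausible reading: pretrain several candidate models (here, from different seeds), compare them pairwise on a held-out validation metric in a single-elimination bracket, and use the winner's weights to initialize federated training. All names here (`Candidate`, `pretrain`, `val_loss`, `run_tournament`) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of tournament-based pretraining for FL initialization.
# Not the paper's implementation; all names and logic are assumptions.
import random
from dataclasses import dataclass

@dataclass
class Candidate:
    """A pretrained model candidate: a seed and its resulting weight vector."""
    seed: int
    weights: list[float]

def pretrain(seed: int, dim: int = 4) -> Candidate:
    """Stand-in for local pretraining: derive weights from a seed."""
    rng = random.Random(seed)
    return Candidate(seed, [rng.gauss(0.0, 1.0) for _ in range(dim)])

def val_loss(c: Candidate) -> float:
    """Stand-in for held-out validation loss (lower is better)."""
    return sum(w * w for w in c.weights)  # toy proxy for a real metric

def run_tournament(candidates: list[Candidate]) -> Candidate:
    """Single-elimination bracket: in each round, the candidate with the
    lower validation loss in each pair advances until one remains."""
    pool = candidates[:]
    while len(pool) > 1:
        nxt = []
        for i in range(0, len(pool) - 1, 2):
            a, b = pool[i], pool[i + 1]
            nxt.append(a if val_loss(a) <= val_loss(b) else b)
        if len(pool) % 2:  # an odd candidate out gets a bye to the next round
            nxt.append(pool[-1])
        pool = nxt
    return pool[0]

if __name__ == "__main__":
    candidates = [pretrain(seed) for seed in range(8)]
    winner = run_tournament(candidates)
    print(f"Seed {winner.seed} wins; use its weights as the FL global init.")
```

One design point this sketch highlights: a tournament needs only pairwise comparisons on validation performance, so it avoids committing to a single global ranking metric before federated training begins.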
- Research Organization: Argonne National Laboratory (ANL), Argonne, IL (United States)
- Sponsoring Organization: USDOE Office of Science (SC); National Science Foundation (NSF)
- DOE Contract Number: AC02-06CH11357
- OSTI ID: 2280826
- Resource Relation: Conference: 9th International Workshop on Data Analysis and Reduction for Big Scientific Data, held in conjunction with the 2023 International Conference for High Performance Computing, Networking, Storage, and Analysis, November 12, 2023, Denver, CO, United States
- Country of Publication: United States
- Language: English