
Privacy-Preserving Federated Learning for Science: Challenges and Research Directions

Conference

This paper discusses the key challenges and future research directions for privacy-preserving federated learning (PPFL), with a focus on its application to large-scale scientific AI models, in particular foundation models (FMs). PPFL enables collaborative model training across distributed datasets while preserving privacy, an important collaborative approach for science. We discuss the need for efficient and scalable algorithms to address the increasing complexity of FMs, particularly when dealing with heterogeneous clients. In addition, we underscore the need for advanced privacy-preserving techniques, such as differential privacy, to balance privacy and utility in large FMs, as well as fairness and incentive mechanisms to ensure equitable participation among heterogeneous clients. Finally, we emphasize the need for a robust software stack supporting scalable and secure PPFL deployments across multiple high-performance computing facilities. We envision that PPFL will play a crucial role in advancing scientific discovery and enabling large-scale, privacy-aware collaborations across science domains.
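As a rough illustration of how differential privacy can be layered onto federated aggregation, the Python sketch below implements a DP-FedAvg-style server step: each client's update is clipped to a fixed L2 norm and Gaussian noise is added before averaging. The function names and parameter values are illustrative assumptions, not drawn from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's method): a differentially
# private federated-averaging step. Each client submits a model
# update; the server clips every update to a fixed L2 norm and adds
# Gaussian noise before averaging, trading utility for privacy.

def clip_update(update, clip_norm):
    """Scale an update so its L2 norm is at most clip_norm."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))

def dp_federated_average(client_updates, clip_norm=1.0,
                         noise_multiplier=1.0, rng=None):
    """Aggregate client updates with clipping and Gaussian noise."""
    rng = rng or np.random.default_rng(0)
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    mean = np.mean(clipped, axis=0)
    # Noise scale follows the clipping bound; averaging over more
    # clients shrinks the per-client noise contribution.
    sigma = noise_multiplier * clip_norm / len(client_updates)
    return mean + rng.normal(0.0, sigma, size=mean.shape)

# Toy usage: three heterogeneous clients with different update magnitudes.
updates = [np.array([0.5, -0.2]), np.array([3.0, 1.0]), np.array([-0.1, 0.4])]
print(dp_federated_average(updates, clip_norm=1.0, noise_multiplier=0.5))
```

In practice the clipping bound and noise multiplier together determine the privacy-utility trade-off the abstract highlights: tighter clipping and larger noise strengthen the privacy guarantee but degrade the aggregated model update.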

Research Organization:
Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
Sponsoring Organization:
USDOE
DOE Contract Number:
AC05-00OR22725
OSTI ID:
2538217
Country of Publication:
United States
Language:
English

Similar Records

Position Papers for the ASCR Workshop on Cybersecurity and Privacy for Scientific Computing Ecosystems
Technical Report · November 2021 · OSTI ID: 1843573

Privacy Preserving Federated Learning for Advanced Scientific Ecosystems
Conference · December 2024 · OSTI ID: 2538056

Privacy Preserving Federated Learning for Advanced Scientific Ecosystems
Conference · December 2024 · OSTI ID: 3002673
