Final Technical Report: Randomized Federated Learning Methods for Nonsmooth, Nonconvex, and Hierarchical Optimization
This final technical report summarizes the outcomes of a DOE-funded project on federated learning (FL) for scientific machine learning under nonsmooth, nonconvex, and hierarchical optimization settings. The project develops new mathematical models, algorithms, and theoretical guarantees for decentralized stochastic, bilevel, and minimax optimization problems arising in DOE mission-relevant applications. A unified framework of randomized and zeroth-order federated optimization methods is introduced, with provable convergence, communication-efficiency, and sample-complexity guarantees. The report documents the algorithmic design, theoretical analysis, and empirical validation of the proposed methods. The project also contributes to workforce development through graduate training and dissemination of results via publications and seminars.
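To make the abstract's "randomized and zeroth-order federated optimization" concrete, the sketch below combines a standard two-point Gaussian-smoothing gradient estimator with a FedAvg-style aggregation loop. This is a minimal illustration of the general technique, not the project's actual algorithm: the function names (`zo_grad`, `federated_zo_sgd`), the toy quadratic losses, and all step-size and round counts are illustrative assumptions.

```python
import numpy as np

def zo_grad(f, x, mu=1e-4, rng=None):
    """Two-point randomized (Gaussian-smoothing) gradient estimator:
    g = (f(x + mu*u) - f(x)) / mu * u, with u ~ N(0, I).
    Uses only function evaluations, so it applies when gradients of f
    are unavailable (zeroth-order / derivative-free setting)."""
    rng = rng or np.random.default_rng(0)
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

def federated_zo_sgd(local_losses, x0, rounds=50, local_steps=5, lr=0.05):
    """FedAvg-style loop: each client runs a few zeroth-order SGD steps
    on its local loss, then the server averages the client iterates.
    Communication happens once per round, not once per step."""
    x = x0.copy()
    rng = np.random.default_rng(0)
    for _ in range(rounds):
        updates = []
        for f in local_losses:            # one local loss per client
            xi = x.copy()
            for _ in range(local_steps):
                xi -= lr * zo_grad(f, xi, rng=rng)
            updates.append(xi)
        x = np.mean(updates, axis=0)      # server aggregation step
    return x

# Toy example (hypothetical): two clients hold shifted quadratics, so the
# minimizer of the average loss sits between the two clients' optima.
fs = [lambda x: np.sum((x - 1.0) ** 2),
      lambda x: np.sum((x + 1.0) ** 2)]
x_star = federated_zo_sgd(fs, x0=np.ones(3) * 5.0)
```

Averaging iterates rather than gradients is what keeps communication infrequent; the zeroth-order estimator trades gradient access for extra variance, which is why the report's sample-complexity analysis matters in this regime.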
- Research Organization: Rutgers, The State University; University of Michigan
- Sponsoring Organization: USDOE
- DOE Contract Number: SC0023303
- OSTI ID: 3011291
- Report Number(s): DOE-Rutgers-23303
- Country of Publication: United States
- Language: English
Similar Records
- A nonsmooth nonconvex optimization algorithm for two-stage optimization problems · Technical Report · March 30, 2022 · OSTI ID: 2205285
- Local convergence analysis of an inexact trust-region method for nonsmooth optimization · Journal Article · February 20, 2024 · Optimization Letters · OSTI ID: 2477559
- Manifold Sampling for Optimizing Nonsmooth Nonconvex Compositions · Journal Article · October 27, 2021 · SIAM Journal on Optimization · OSTI ID: 1839939