About This Architecture
The Federated Learning Secure Round Protocol orchestrates encrypted model training between a central Federated Server and distributed Federated Clients in five sequential phases:

1. The server derives per-client encryption keys.
2. The server transmits the AES-GCM-encrypted global model to each client.
3. Each client decrypts the model, trains on its local data, and uploads its encrypted gradients.
4. The server decrypts the clients' gradients.
5. The server performs federated averaging to update the global model.

Model weights and gradients remain encrypted in transit for the entire round, so neither the server nor the clients expose raw data or unencrypted model updates over the network. Fork this diagram to customize the key derivation function, swap in a different encryption algorithm, or integrate the flow with your federated learning framework.