FederaTrans
Federated Transformers with Multimodal Sensitive Data for Secure Learning and Collaboration in Distributed Environments: Applications in Healthcare
September 1, 2025 - August 31, 2028 | PID2024-158373OB-I00
UGRITAI-Lab (Department of Computer Science and A.I.)
The Federated Transformers project tackles the challenge of training AI models across distributed and privacy-sensitive data environments—especially in healthcare, where patient data is both highly valuable and strictly protected. By leveraging transformer architectures within a federated learning framework, we enable secure, decentralized training on multimodal data—such as medical text, images, and multilingual records—without ever moving or centralizing sensitive information. Our research develops new methods for federated transformer training, personalized local models, communication-efficient aggregation, and robust privacy guarantees. The goal is to provide hospitals and research institutions with powerful, collaborative AI tools that respect data sovereignty and regulatory requirements, advancing both federated learning and multimodal AI in real-world distributed settings.
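The decentralized training loop described above follows the general federated averaging pattern: each site trains locally on its private data and only model updates leave the site, which the server aggregates into a new global model. A minimal NumPy sketch of that pattern is below; the linear model, synthetic client shards, and all hyperparameters are hypothetical illustrations, not the project's actual method or architecture.

```python
# Minimal federated averaging (FedAvg) sketch. Each "client" (e.g. a
# hospital) holds a private data shard; only weight vectors are exchanged,
# never raw data. Model and data are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1, epochs=5):
    """A few epochs of gradient descent on one client's private shard.
    Only the updated weights are returned to the server."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def fedavg(client_data, rounds=20, dim=3):
    """Server loop: broadcast global weights, collect local updates,
    average them weighted by each client's sample count."""
    w_global = np.zeros(dim)
    sizes = np.array([len(y) for _, y in client_data], dtype=float)
    for _ in range(rounds):
        local_ws = [local_update(w_global, X, y) for X, y in client_data]
        w_global = np.average(local_ws, axis=0, weights=sizes)
    return w_global

# Synthetic clients: shards of the same underlying linear relation,
# standing in for distributed multimodal records.
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for n in (40, 60, 100):
    X = rng.normal(size=(n, 3))
    y = X @ w_true + 0.01 * rng.normal(size=n)
    clients.append((X, y))

w = fedavg(clients)
```

In a realistic deployment the local step would train a transformer and the aggregation would be made communication-efficient and privacy-preserving (e.g. update compression, secure aggregation, differential privacy), which is exactly the gap the project's research targets.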
Grant PID2024-158373OB-I00 funded by MICIU/AEI/10.13039/501100011033 and by ERDF/EU
(MICIU: Ministry of Science, Innovation and Universities, AEI: State Research Agency, ERDF: European Regional Development Fund, Next Generation EU: EU Recovery, Transformation and Resilience Plan)