This talk presents the Large Wireless Model (LWM), the world’s first foundation model for wireless channels. Inspired by the success of foundation models in NLP, speech, and vision, LWM is a transformer-based model pre-trained in a self-supervised fashion on large-scale, diverse wireless datasets. It learns rich, universal, contextualized channel embeddings (features) that can enhance performance across a wide range of downstream tasks. I will present the model’s architecture, its self-supervised pre-training approach, and its training datasets. I will also demonstrate its gains in tasks such as sub-6GHz to mmWave beam prediction, LoS/NLoS classification, and localization. These gains highlight LWM’s ability to learn from large-scale wireless data and to enable complex machine learning tasks with limited data in wireless communication and sensing systems.
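The workflow the abstract describes, a frozen pretrained encoder producing channel embeddings that feed a lightweight downstream model, can be sketched as follows. This is a minimal illustration, not LWM's actual API: the frozen random projection stands in for the pretrained transformer, and the synthetic LoS/NLoS data and all names are assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for LWM: a frozen encoder mapping raw channels to embeddings.
# Here a fixed random projection + tanh; the real model is a pretrained transformer.
D_CH, D_EMB = 64, 16
W_frozen = rng.normal(size=(D_CH, D_EMB)) / np.sqrt(D_CH)

def embed(channels):
    """Frozen 'foundation model' embedding (placeholder for LWM)."""
    return np.tanh(channels @ W_frozen)

# Synthetic LoS/NLoS data: LoS channels carry extra direct-path energy.
n = 400
labels = rng.integers(0, 2, size=n)            # 1 = LoS, 0 = NLoS
direct = np.outer(labels, np.ones(D_CH))       # direct-path component for LoS samples
channels = direct * 2.0 + rng.normal(size=(n, D_CH))

X = embed(channels)                            # embeddings, encoder stays frozen

# Lightweight downstream head: logistic regression trained by gradient descent.
w, b = np.zeros(D_EMB), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - labels)) / n
    b -= 0.5 * np.mean(p - labels)

acc = np.mean(((X @ w + b) > 0).astype(int) == labels)
print(f"LoS/NLoS probe accuracy: {acc:.2f}")
```

The point of the pattern is that only the small head is trained on task data, which is why such pipelines can work with limited labeled samples.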
Finally, we introduce an ITU AI/ML 5G competition that provides a modular setup in which participants can innovate on scenario design, feature extraction, and lightweight downstream models, pushing the frontiers of robustness, generalizability, and interpretability. Through improved scores and model refinements, the challenge also opens the door to discussion on data formats, reproducible simulations, and alignment with 6G use cases. The outcomes are expected to influence real-world deployments, research reproducibility, and standardization frameworks for wireless AI.
Learning Objectives:
Institution
Universität Hamburg
Adeline Scharfenberg