In an era where digital technologies are reshaping industries and daily life, the environmental impact of AI systems has become a growing concern. This course explores efficient AI methodologies to address these challenges. From deep learning model compression to low-bit quantization and collaborative inference, we delve into techniques that enhance computational efficiency and reduce energy consumption. In Week 2, we focus on low-bit quantization specifically for large language models (LLMs), showcasing cutting-edge open-source tools and models. Join us to learn how to build sustainable AI systems while pushing the boundaries of innovation.
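As a small taste of the Week 2 topic, the sketch below illustrates the core idea behind low-bit quantization: storing weights as small integers plus a scale factor instead of full-precision floats. This is a minimal, illustrative example (not taken from the course materials), assuming simple symmetric per-tensor quantization implemented with NumPy.

```python
import numpy as np

def quantize_symmetric(weights: np.ndarray, num_bits: int = 4):
    """Quantize a float tensor to signed low-bit integers (symmetric, per-tensor)."""
    qmax = 2 ** (num_bits - 1) - 1            # e.g. 7 for 4-bit
    scale = np.abs(weights).max() / qmax      # map the largest magnitude onto qmax
    q = np.clip(np.round(weights / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

# Toy demonstration on a small random weight matrix
w = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_symmetric(w, num_bits=4)
print("max reconstruction error:", np.abs(w - dequantize(q, scale)).max())
```

Real LLM quantization schemes (for example per-channel or group-wise scaling, as used by popular open-source tools) build on this same principle while reducing the reconstruction error further.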
This course is part of the Sustainability in the Digital Age series, a collaborative project among colleagues from Stanford University, SAP, and the Hasso Plattner Institute.
The course runs for two weeks, with a total workload of approximately 6 to 10 hours.