The presentation series “Train your engineering network” on diverse topics in Machine Learning is aimed at all interested persons at TUHH, at MLE partners, and at the Hamburg region in general. It promotes the exchange of information and knowledge among these groups, as well as their networking, in a relaxed atmosphere. In this way, the machine learning activities within MLE, at TUHH, and in the wider environment become more visible, cooperation is fostered, and interested students gain insight into the field.
Emin Nakilcioglu - Parameter-Efficient Fine-Tuning for Domain-Specific Automatic Speech Recognition
With the introduction of early pre-trained language models such as Google’s BERT and the early GPT models, we have seen ever-increasing excitement and interest in foundation models. To leverage an existing pre-trained foundation model and adapt it to a specific task or domain, the model needs to be fine-tuned on domain-specific data. However, full fine-tuning can be quite resource-intensive and costly, as millions of parameters are modified during training. Parameter-efficient fine-tuning (PEFT) is a family of techniques designed to fine-tune models while minimizing the required resources and cost. One way it achieves this efficiency is by freezing some of the layers of the pre-trained model and fine-tuning only the last few layers that are specific to the downstream task. With the help of PEFT, we can strike a balance between retaining the valuable knowledge of the pre-trained model and adapting it effectively to the downstream task with far fewer trainable parameters.
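The freezing strategy described above can be sketched in a few lines of PyTorch. The model below is a small stand-in, not the actual ASR model from the talk; its layer names and sizes are illustrative assumptions, but the pattern (freeze everything, then unfreeze only the task-specific head and pass just those parameters to the optimizer) is the general one.

```python
import torch
import torch.nn as nn

# Illustrative stand-in for a pre-trained model; sizes are arbitrary.
model = nn.Sequential(
    nn.Linear(80, 256),   # "pre-trained" feature layers ...
    nn.ReLU(),
    nn.Linear(256, 256),
    nn.ReLU(),
    nn.Linear(256, 32),   # task-specific head for the downstream domain
)

# Freeze all parameters, then unfreeze only the final (task-specific) layer.
for p in model.parameters():
    p.requires_grad = False
for p in model[-1].parameters():
    p.requires_grad = True

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable: {trainable} / {total} ({100 * trainable / total:.1f}%)")

# Only the unfrozen parameters are handed to the optimizer.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```

Here only about 9% of the parameters remain trainable; on real foundation models with hundreds of millions of parameters, the savings in memory and compute are correspondingly larger.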
Lectures will be held online via Zoom, in English, on Mondays at 16:00 during the winter semester 2023. General Zoom link for all lectures: Link