Study-Unit Description


CODE ARI5121

 
TITLE Applied Natural Language Processing

 
UM LEVEL 05 - Postgraduate Modular Diploma or Degree Course

 
MQF LEVEL 7

 
ECTS CREDITS 5

 
DEPARTMENT Artificial Intelligence

 
DESCRIPTION Natural language processing (NLP) is an important subfield of Artificial Intelligence. This study-unit presents a comprehensive and practical exploration of cutting-edge NLP techniques, emphasizing deep learning methods and state-of-the-art models. It addresses complex NLP applications and recent research advancements in the field. Students will engage with advanced neural architectures, including Transformers, large-scale pretrained language models (LLMs), multilingual NLP, and sophisticated generative models. They will also be exposed to speech processing techniques. Applied projects will leverage frameworks such as Hugging Face, PyTorch, and TensorFlow to solve real-world tasks. Ethical considerations, bias mitigation, and model interpretability are integrated throughout.

Study-Unit Aims:

The study-unit aims to deepen students' understanding and practical skills in contemporary NLP techniques, focusing on advanced neural architectures and state-of-the-art language models. Students will be equipped to design, fine-tune, and deploy LLMs for complex tasks, including multilingual and generative applications. They will develop the ability to critically evaluate and interpret NLP models, including identifying biases, limitations, and ethical concerns. Moreover, students will have the opportunity to apply this knowledge to Maltese and other low-resource languages, highlighting their specific linguistic challenges and potential solutions.

Learning Outcomes:

1. Knowledge & Understanding:

By the end of the study-unit the student will be able to:

- Explain the Transformer architecture;
- Explain the differences between encoder-only, decoder-only, and encoder-decoder architectures and their suitability for various NLP tasks;
- Fine-tune BERT-based models;
- Explain the differences among in-context learning, zero-shot learning, and few-shot learning;
- Design and deploy fine-tuning strategies, including parameter-efficient methods such as LoRA and adapters;
- Articulate methods for bias detection, ethical AI practices, interpretability, and explainability in deep NLP models;
- Discuss the state-of-the-art, with a focus on low-resource NLP.
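To give a flavour of the parameter-efficient fine-tuning methods listed above, the following is a minimal sketch of the LoRA idea: the pretrained weight matrix W stays frozen, and only a low-rank update B·A (scaled by alpha/r) is trained. This toy example is illustrative only, not part of the study-unit materials; all names and sizes are assumptions, and in practice a library such as Hugging Face PEFT would be used.

```python
# Illustrative LoRA sketch in plain Python: W_eff = W + (alpha / r) * (B @ A),
# where A has shape (r x d_in) and B has shape (d_out x r), r << d.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def lora_effective_weight(W, A, B, alpha):
    """Frozen weight W plus the scaled low-rank trainable update."""
    r = len(A)                       # rank of the update
    scale = alpha / r
    delta = matmul(B, A)             # (d_out x d_in) low-rank update
    return [[w + scale * d for w, d in zip(w_row, d_row)]
            for w_row, d_row in zip(W, delta)]

# Toy example: d_out = d_in = 2, rank r = 1.
W = [[1.0, 0.0], [0.0, 1.0]]         # frozen pretrained weight
A = [[1.0, 2.0]]                     # trainable, r x d_in
B = [[0.5], [0.25]]                  # trainable, d_out x r
W_eff = lora_effective_weight(W, A, B, alpha=1.0)

# Why this is "parameter-efficient": full fine-tuning of a d x d matrix
# trains d**2 values, whereas a rank-r adapter trains r * (d_in + d_out).
d = 768                              # e.g. a BERT-base hidden size
full_params = d * d                  # 589,824
lora_params = 8 * (d + d)            # rank-8 adapter: 12,288
```

The parameter count at the end shows why such methods make fine-tuning large models tractable: the adapter here is roughly 2% of the size of the full weight matrix.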

2. Skills:

By the end of the study-unit the student will be able to:

- Implement and fine-tune pretrained BERT models for specific NLP tasks;
- Develop NLP applications addressing issues such as limited data, morphological complexity and resource scarcity;
- Apply techniques for interpretability and explainability in NLP models;
- Critically evaluate NLP models, identify and mitigate biases, and implement responsible AI practices in applied NLP projects.
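As a small illustration of the interpretability techniques mentioned above, the sketch below shows occlusion-based token importance: remove each token in turn and measure how the model's score changes. The lexicon "model" here is a deliberately trivial stand-in for any real classifier (e.g. a fine-tuned BERT scorer); it and all names are assumptions for illustration only.

```python
# Occlusion-based interpretability sketch: a token's importance is the drop
# in the model's score when that token is removed from the input.

LEXICON = {"great": 2.0, "good": 1.0, "bad": -1.0, "awful": -2.0}

def score(tokens):
    """Toy sentiment 'model': sum of per-token lexicon weights."""
    return sum(LEXICON.get(t, 0.0) for t in tokens)

def occlusion_importance(tokens):
    """Map each token to (base score) - (score with that token removed)."""
    base = score(tokens)
    return {
        tok: base - score(tokens[:i] + tokens[i + 1:])
        for i, tok in enumerate(tokens)
    }

importance = occlusion_importance(["the", "movie", "was", "great"])
```

With a real neural classifier the same loop applies unchanged: only `score` is replaced by a forward pass, which is what makes occlusion a popular model-agnostic baseline before moving to gradient-based attribution methods.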

Main Text/s and any supplementary readings:

Main Texts:

- Jurafsky, D. and Martin, J. H. (2025). Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition with Language Models (3rd ed.). Online release, January 2025.

Supplementary Readings:

- Mitchell, T. (1997). Machine Learning. McGraw-Hill.
- Manning, C. D. and Schütze, H. (1999). Foundations of Statistical Natural Language Processing. MIT Press, Cambridge, Massachusetts.
- Bird, S., Klein, E. and Loper, E. (2009). Natural Language Processing with Python. O'Reilly.
- Maynard, D., Bontcheva, K. and Augenstein, I. (2016). Natural Language Processing for the Semantic Web. Morgan & Claypool.

 
STUDY-UNIT TYPE Ind Study, Lecture, Ind Online Learning & Project

 
METHOD OF ASSESSMENT
Assessment Component/s: Project
Assessment Due: SEM2
Sept. Asst Session: Yes
Weighting: 100%

 
LECTURER/S Claudia Borg
Andrea De Marco
Marc Tanti

 

 
The University makes every effort to ensure that the published Courses Plans, Programmes of Study and Study-Unit information are complete and up-to-date at the time of publication. The University reserves the right to make changes in case errors are detected after publication.
The availability of optional units may be subject to timetabling constraints.
Units not attracting a sufficient number of registrations may be withdrawn without notice.
It should be noted that all the information in the description above applies to study-units available during the academic year 2025/6. It may be subject to change in subsequent years.
