2024-10-31
Fateme Daneshfar

Academic rank: Assistant Professor
Education: Ph.D.
ScopusId: 35078447100
Faculty: Faculty of Engineering
Address: Department of Computer Engineering, Faculty of Engineering, University of Kurdistan

Research

Title
Using Deep Learning Transformers for Semantic Similarity in a Sorani Kurdish Question Answering System
Type
Thesis
Keywords
Question-Answering System, Sorani Kurdish, Deep Learning Models, Transformer Models
Year
2024
Researchers: Abrar Amin Saeed (Student), Fateme Daneshfar (Primary Advisor)

Abstract

This research develops a question-answering system for the Sorani Kurdish language using advanced deep learning models such as BERT, GPT, and T5. The main objective is to provide accurate and relevant answers to user queries while accounting for the challenges of processing a low-resource language. A dataset of 1,000 question-answer pairs in Sorani Kurdish was loaded and preprocessed for training and evaluating the transformer models. Model performance was assessed using common metrics: accuracy, precision, recall, and F1 score. The evaluation results indicate that BERT achieved the best performance, with an accuracy of 0.98 and high precision and recall scores. T5 ranked second with an accuracy of 0.86 and an F1 score of 0.83, while GPT performed substantially worse and would require further optimization. These findings suggest that transformer models, especially BERT and T5, are well suited to processing low-resource languages.
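The abstract does not specify how precision, recall, and F1 were computed for free-text answers, so the sketch below shows one common convention: SQuAD-style token-overlap F1 plus exact match over predicted and gold answers. The helper names and the tiny example triples are hypothetical illustrations, not the thesis's actual evaluation code or data.

```python
from collections import Counter

def token_f1(prediction: str, reference: str) -> float:
    """Token-overlap F1 between a predicted and a gold answer,
    as used in SQuAD-style QA evaluation."""
    pred_tokens = prediction.split()
    ref_tokens = reference.split()
    common = Counter(pred_tokens) & Counter(ref_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)   # fraction of predicted tokens that are correct
    recall = overlap / len(ref_tokens)       # fraction of gold tokens that were recovered
    return 2 * precision * recall / (precision + recall)

def exact_match(prediction: str, reference: str) -> float:
    """1.0 if the prediction matches the gold answer exactly, else 0.0."""
    return float(prediction.strip() == reference.strip())

# Hypothetical mini-set of (question, gold answer, model answer) triples.
examples = [
    ("Q1", "slemani", "slemani"),
    ("Q2", "hewler erbil", "hewler"),
]
em = sum(exact_match(pred, gold) for _, gold, pred in examples) / len(examples)
f1 = sum(token_f1(pred, gold) for _, gold, pred in examples) / len(examples)
print(f"EM={em:.2f}  F1={f1:.2f}")
```

Averaging these per-example scores over all 1,000 pairs would yield corpus-level figures comparable to those reported for BERT, T5, and GPT.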