CGRA-DeBERTa: A Concept-Guided Residual Augmentation Transformer for Islamic Theological Understanding
A new transformer model beats BERT and DeBERTa by more than 8 points on Hadith question answering while adding only 8% inference overhead.
Researchers from Pakistan and Saudi Arabia developed CGRA-DeBERTa, a transformer specialized for Islamic theological understanding. It uses a Concept-Guided Residual Augmentation framework built around a curated dictionary of 12 core theological terms and a gating mechanism that scales concept signals by factors between 1.04 and 3.00. Trained on 42,591 QA pairs from Sahih al-Bukhari and Sahih Muslim, it achieved an Exact Match (EM) score of 97.85, surpassing DeBERTa's 89.77. The model enables accurate, efficient span extraction from classical Islamic texts for educational applications.
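The concept-guided gating described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the concept terms shown, the sigmoid gate mapping, and the function names are all assumptions; only the 1.04-3.00 scaling range comes from the summary.

```python
import numpy as np

# Illustrative subset of the curated concept dictionary
# (the actual 12 terms are not listed in the summary).
CONCEPT_DICT = {"hadith", "sunnah", "isnad"}

def concept_gate(raw_gate: float, lo: float = 1.04, hi: float = 3.00) -> float:
    """Map an unbounded gate value into the reported [1.04, 3.00] scaling range
    via a sigmoid (a hypothetical choice; the paper's exact mapping may differ)."""
    return lo + (hi - lo) / (1.0 + np.exp(-raw_gate))

def augment(hidden: np.ndarray, tokens: list[str], raw_gate: float = 0.0) -> np.ndarray:
    """Scale the hidden states of tokens found in the concept dictionary;
    all other tokens pass through the residual path unchanged."""
    out = hidden.copy()
    for i, tok in enumerate(tokens):
        if tok.lower() in CONCEPT_DICT:
            out[i] = hidden[i] * concept_gate(raw_gate)
    return out
```

The idea is that dictionary-matched theological terms receive an amplified residual contribution, steering the span extractor's attention toward concept-bearing tokens without altering the rest of the sequence.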
Why It Matters
It enables precise, nuanced AI tools for religious education and scholarship, addressing a notable gap in domain-specific language models.