Leveraging Transfer Learning for NLP in Extremely Low-Resource Language Settings
Abhay Bhatia1, Anil Kumar2

1Dr. Abhay Bhatia, Associate Professor, Department of Computer Science and Engineering, Roorkee Institute of Technology, Roorkee (Uttarakhand), India.

2Dr. Anil Kumar, Associate Professor, Department of Computer Science and Engineering, Roorkee Institute of Technology, Roorkee (Uttarakhand), India.  

Manuscript received on 30 September 2025 | Revised Manuscript received on 07 October 2025 | Manuscript Accepted on 15 October 2025 | Manuscript published on 30 October 2025 | PP: 15-19 | Volume-12 Issue-10, October 2025 | Retrieval Number: 100.1/ijies.K113212111125 | DOI: 10.35940/ijies.K1132.12101025

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Globally, individuals are increasingly gaining access to modern technologies, which provide unprecedented access to knowledge, justice, and information. To ensure universal accessibility, widely spoken languages must be supported, and it is essential to enable computers to comprehend human language. Existing computational methodologies have advanced significantly, yet these improvements often require substantial resources for data generation and processing. This reliance on resources hinders the efficient advancement of cross-lingual transfer techniques, especially for tasks like discourse analysis that require rigorous training and evaluation across a broad range of languages and domains. Existing systems are resource-intensive, and the best language processing methods have not been able to cover many languages and domains. Transfer learning seeks to solve this problem by leveraging pre-trained models, trained on large datasets, to work in resource-constrained environments. These strategies have gained popularity for their effectiveness in managing limited resources across diverse tasks, areas, and languages. This research focuses on the application of transfer learning approaches to Indian languages, notably Hindi and its code-mixed English-Hindi variety, which is commonly found on social networking sites. We examine cross-task and cross-lingual transfer strategies across several downstream tasks, demonstrating their effectiveness despite limited training data and computational resources. We also propose a syntactic-semantic curriculum-based learning architecture for English-Hindi code-mixed sentiment analysis, resulting in significant performance improvements.
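The curriculum-based idea sketched in the abstract, ordering English-Hindi code-mixed training examples from easier (near-monolingual) to harder (heavily mixed) before fine-tuning, can be illustrated with a minimal sketch. The difficulty measure below (a code-mixing ratio based on Devanagari characters) is an illustrative assumption for this example, not the authors' exact scoring function.

```python
def code_mixing_ratio(sentence):
    # Fraction of whitespace tokens containing Devanagari characters,
    # used here as an assumed proxy for the Hindi share of a sentence.
    tokens = sentence.split()
    if not tokens:
        return 0.0
    hindi = sum(1 for t in tokens
                if any('\u0900' <= ch <= '\u097F' for ch in t))
    return hindi / len(tokens)

def curriculum_order(examples):
    # Sort (sentence, label) pairs easiest-first: near-monolingual
    # sentences (ratio close to 0 or 1) precede heavily mixed ones
    # (ratio close to 0.5). Python's sort is stable, so ties keep
    # their original order.
    def difficulty(pair):
        r = code_mixing_ratio(pair[0])
        return min(r, 1.0 - r)  # 0 = monolingual, 0.5 = maximally mixed
    return sorted(examples, key=difficulty)

# Hypothetical code-mixed sentiment examples (1 = positive, 0 = negative):
data = [
    ("yeh movie बहुत अच्छी thi", 1),  # mixed English-Hindi
    ("great film", 1),                # monolingual English
    ("बेकार फिल्म", 0),               # monolingual Hindi
]
ordered = curriculum_order(data)  # monolingual first, mixed last
```

A training loop would then present `ordered` in stages, fine-tuning the pre-trained model on the easier examples before introducing the heavily code-mixed ones.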

Keywords: Cross-Lingual Transfer, Discourse Analysis, Low-Resource Languages, Resource-Intensive Systems, Transfer Learning, Pre-Trained Models.
Scope of the Article: Artificial Intelligence and Methods