Authors: Ritik Sadh, Preeti Sharma, Priyanshu Singh, Vansh Guleria, Akthar Warsi
Abstract: Multilingual translation is a critical component of global communication systems. Despite significant advances in neural machine translation (NMT), challenges such as contextual ambiguity, low-resource languages, and domain adaptation persist. This paper examines how translation quality can be enhanced by leveraging context-aware large language models (LLMs). By integrating transformer-based architectures with contextual embeddings, the proposed approach improves semantic consistency, translation fluency, and cross-lingual transfer learning. The study evaluates performance using BLEU and METEOR scores while also considering qualitative human evaluation. Results indicate that context-aware LLMs significantly outperform traditional models in handling long-range dependencies and multilingual tasks. The paper concludes with a discussion of limitations and future research directions.