Design And Evaluation Of An AI-Based Adaptive Mock Interview System Using NLP And Real-Time Feedback Analysis

22 Apr

Authors: Himanshu Kumar, Rohit Kumar, Aditya, Prof. Shivangi Patel, Prof. Nitin Pal

Abstract: Interview preparation is often inconsistent: many candidates rely on repeated question lists and generic advice that offers no clear path to improvement. This work presents an adaptive mock interview system that makes preparation more practical by combining interaction, evaluation, and guidance in one place. The system is designed to behave like an interviewer, asking questions, examining answers, and giving feedback within the same session. To achieve this, it processes user responses with natural language processing (NLP) techniques and supports both typed and spoken input. Each answer is evaluated from multiple perspectives, including how well it addresses the question, how clearly it is expressed, and the overall tone of the response. Based on these observations, the system assigns a score and offers suggestions that users can apply immediately. A key feature is the system's ability to adjust question difficulty during the session: strong performance gradually raises the level of challenge, while weaker performance leads to simpler or more guided questions. This adjustment maintains balance and keeps the user engaged without making the session too easy or too difficult. The system was tested under controlled conditions to observe its behaviour across different types of responses. The results show stable performance, timely feedback, and consistent evaluation. Repeated interaction also produces noticeable improvement, indicating that the system supports gradual learning and skill development. Overall, the proposed approach offers a structured and flexible way to practice interviews, reducing dependence on manual guidance while helping users build confidence through continuous feedback and adaptation.
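The abstract does not publish the scoring or adaptation algorithm, but the two mechanisms it describes, multi-criteria answer scoring and session-level difficulty adjustment, can be sketched as follows. This is a minimal Python illustration under stated assumptions: the class names, the four difficulty tiers, and the promotion/demotion thresholds are all hypothetical, and the per-criterion scores would in practice come from NLP models rather than being supplied directly.

```python
from dataclasses import dataclass


@dataclass
class Evaluation:
    """Hypothetical per-answer scores in [0, 1] for the three
    perspectives the abstract mentions: relevance, clarity, tone.
    In the real system these would be produced by NLP models."""
    relevance: float
    clarity: float
    tone: float

    @property
    def score(self) -> float:
        # Simple unweighted average as an illustrative overall score.
        return (self.relevance + self.clarity + self.tone) / 3


class AdaptiveDifficulty:
    """Illustrative controller: raise the question level after a
    strong answer, drop to a simpler/more guided level after a
    weak one, otherwise hold steady. Tiers and thresholds are
    assumptions, not the paper's actual values."""

    LEVELS = ("guided", "easy", "medium", "hard")

    def __init__(self, start: str = "medium",
                 raise_at: float = 0.75, lower_at: float = 0.45):
        self._level = self.LEVELS.index(start)
        self.raise_at = raise_at
        self.lower_at = lower_at

    def update(self, evaluation: Evaluation) -> str:
        if evaluation.score >= self.raise_at:
            self._level = min(self._level + 1, len(self.LEVELS) - 1)
        elif evaluation.score < self.lower_at:
            self._level = max(self._level - 1, 0)
        return self.LEVELS[self._level]


# Example: a strong answer promotes the session from "medium" to "hard",
# a weak answer demotes it toward more guided questions.
session = AdaptiveDifficulty()
print(session.update(Evaluation(relevance=0.9, clarity=0.8, tone=0.85)))
print(session.update(Evaluation(relevance=0.2, clarity=0.3, tone=0.3)))
```

The bounded ladder of levels mirrors the abstract's claim that the session never becomes too easy or too difficult: repeated strong or weak answers saturate at the hardest or most guided tier instead of drifting without limit.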