Volume 6, Issue 2

5 Mar

Experimental Study of Strength and Behaviour of Clayey Soil using Sawdust Ash and Lime

Authors- Asst. Prof. Jeron R., UG Students Saajan Simon, Sobu Geevarghese Thampan, Romy Philip, and Shibin Varghese

Abstract- This study assesses the geotechnical characteristics of lateritic soil and sawdust ash-lime (SDAL) mixtures. Preliminary tests were carried out on the natural soil sample for identification and classification purposes. The sawdust ash was mixed with lime for stabilization in the ratio 2:1, and this mixture was then added to the soil in varying proportions of 2, 4, 6, 8 and 10% by weight of soil. The main objective of the project was to study the stabilization of clayey soil using sawdust ash (SDA) and lime. The index properties of the parent soil, along with the Atterberg limits, compaction characteristics and unconfined compression (UCC) behaviour of both the parent soil and the treated soil, were determined. All tests were conducted in accordance with Indian Standard guidelines. The results show that industrial wastes such as sawdust ash offer an alternative means of stabilizing soil for various construction purposes, and that adding an activator like lime to the sawdust ash yields especially encouraging results. With a small percentage of activator, SDA, an industrial waste, can be used efficiently in soil stabilization, which can reduce road construction costs, particularly in the rural areas of developing countries like India.
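
As a worked illustration of the blending arithmetic implied by the abstract (the 2:1 SDA-to-lime ratio and dosages of 2-10% by weight of soil), the additive masses for a hypothetical 1000 g soil specimen at a 6% dosage would be:

```latex
% Hypothetical specimen: soil mass m_s = 1000 g, additive dosage p = 6%.
% With a 2:1 SDA:lime ratio, SDA takes 2/3 and lime 1/3 of the additive mass.
\[
m_{\text{add}} = \frac{p}{100}\, m_s = 0.06 \times 1000\,\text{g} = 60\,\text{g},
\qquad
m_{\text{SDA}} = \tfrac{2}{3}\, m_{\text{add}} = 40\,\text{g},
\qquad
m_{\text{lime}} = \tfrac{1}{3}\, m_{\text{add}} = 20\,\text{g}.
\]
```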

A Study of the Increasing Slum Population in India and Its Effects on the Livelihood of Urban Areas from a Geographical Perspective

Authors- Dr. K.C. Sharma, Lecturer

Abstract- While India’s economy has continued to boom over the past several decades and the Swachh Bharat Abhiyan (Clean India Mission) enters its second year, its 360 million poorest citizens remain among those living in some of the most dilapidated conditions in the world. Slums have become an inescapable dark side of the country, one it cannot boast of. Owing to the rising population, the number of slum dwellers in Indian cities keeps growing. Slum areas invariably lack basic necessities of life such as clean water, electricity and sanitation. The inhabitants are mostly rickshaw pullers, sex workers, seasonal small vendors and domestic workers, with family incomes ranging from a meagre Rs. 1,500 to Rs. 3,000. After a hard, low-earning working day, most of the men spend their daily earnings on homemade illicit liquor. The status of women in slums is far from respectful; many are driven to prostitution to fulfil their basic survival needs. The slum population is constantly increasing: it has doubled in the past two decades.

The Roles of Parent-Teachers’ Socioeconomic Status and Parental Involvement on Their Children’s Academic Achievement in the Time of Pandemic

Authors- Dr. John Mark Distor

Abstract- Children’s academic achievement is crucial in this time of health crisis, as learning challenges and opportunities have arisen. With the development of the new-normal education environment, the current study evaluated the degree of parent-teachers’ involvement and socioeconomic status, as well as their relationship to children’s academic achievement. The study primarily utilized a descriptive-correlational research design. Data were gathered using an adopted survey questionnaire administered to 39 public school parent-teachers. The results revealed that: (1) the majority of respondents were Bachelor’s degree holders employed in a Teacher I position with Salary Grade 11; parent-teachers’ socioeconomic status was measured in terms of highest educational attainment, teaching position, and salary grade (monthly income); (2) parent-teachers were very involved in their children’s academic achievement; (3) despite the pandemic, the majority of the children achieved outstanding academic performance, while the others received satisfactory average grades; (4) there is a significant positive correlation between parent-teachers’ teaching position and salary grade (income) and their children’s academic achievement; and (5) there is no significant relationship, only a comparatively small negative correlation, between parent-teachers’ involvement and the children’s academic achievement. Thus, parent-teachers took on a greater role and were very proactive in educating and nurturing their young children while still functioning as teachers of other students. Offering financial support and providing a home environment conducive to learning were markedly linked to the children’s academic achievement.
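
For reference, the correlations a descriptive-correlational design reports are typically Pearson product-moment coefficients; the standard statistic (a textbook formula, not one quoted from the paper) is:

```latex
% Pearson's product-moment correlation between variables x and y over n respondents.
\[
r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}
         {\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}}
\]
```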

Java Dump Analysis: Techniques and Best Practices

Authors- RamaKrishna Manchana

Abstract- This paper explores the methods, tools, and best practices involved in Java Virtual Machine (JVM) dump analysis, specifically focusing on thread, heap, and core dumps. The goal is to provide a comprehensive understanding of how these dumps can be utilized to diagnose performance issues, memory leaks, and application failures in Java applications.
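
As a minimal illustration of what thread-dump analysis works with, the standard java.lang.management API can snapshot all thread stacks and check for deadlocks in-process. This sketch is not from the paper; it simply produces the same kind of data a jstack or jcmd thread dump contains:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class ThreadDumpSketch {
    public static void main(String[] args) {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();

        // Snapshot every live thread, including locked monitors and
        // ownable synchronizers -- the raw material of thread-dump analysis.
        for (ThreadInfo info : mx.dumpAllThreads(true, true)) {
            System.out.print(info);  // thread name, state, and stack trace
        }

        // Deadlock detection: a non-null result means threads are blocked in a cycle.
        long[] deadlocked = mx.findDeadlockedThreads();
        if (deadlocked != null) {
            System.err.println(deadlocked.length + " deadlocked thread(s) found");
        }
    }
}
```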

DOI: 10.61463/ijset.vol.6.issue2.103

Leveraging Spring Boot for Biometric SaaS Applications in Hybrid Cloud

Authors- Dr. Vinayak Ashok Bharadi

Abstract- This study presents a Spring Boot-based architecture for deploying biometric authentication systems in a Software-as-a-Service (SaaS) model. Inspired by Ramakrishna Manchana’s 2017 insights into Spring Boot, the proposed framework enables batch processing for large-scale biometric datasets and integrates seamlessly with public and hybrid cloud platforms. By leveraging Java frameworks, the system ensures modularity, scalability, and efficient resource utilization. The framework’s design supports dynamic workload distribution and secure data management, making it suitable for real-time biometric applications in resource-constrained environments.
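
A minimal sketch of the kind of Spring Boot service layer such an architecture implies; the endpoint path, DTO shape, and tenant field here are hypothetical illustrations, not the paper's actual design (assumes spring-boot-starter-web on the classpath):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class BiometricSaasApp {

    // Hypothetical DTO: a base64-encoded biometric template plus a tenant id,
    // reflecting the multi-tenant SaaS model the abstract describes.
    record EnrollmentRequest(String tenantId, String templateBase64) {}

    @PostMapping("/api/v1/enroll")
    public ResponseEntity<String> enroll(@RequestBody EnrollmentRequest req) {
        if (req.templateBase64() == null || req.templateBase64().isBlank()) {
            return ResponseEntity.badRequest().body("empty template");
        }
        // A real deployment would hand off to a batch/matching backend here.
        return ResponseEntity.ok("enrolled for tenant " + req.tenantId());
    }

    public static void main(String[] args) {
        SpringApplication.run(BiometricSaasApp.class, args);
    }
}
```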

DOI: 10.61463/ijset.vol.6.issue2.104

Node Position-Based Multi-Hop Routing For FANETs

Authors- Dhiren Kumar

Abstract- In this work, we present a stateless position-based packet routing method for a flying ad-hoc network (FANET). The goal is to exploit all available geographical data about the network: each node is directed toward its target using its address and coordinates. We introduce UAV technology to improve position-based multicarrier transmission in 3D spaces. The ad hoc network employs a deep reinforcement learning (DRL) algorithm while in flight; together, DRL and location-aided routing (LAR) ensure timely arrival at the final destination. We present a decentralized intelligent routing approach based on DRL that takes into account the state of symmetrical nodes between two hops. The approach incorporates the location, velocity, load degree, and link quality of nodes when constructing the state components, allowing a more complete picture of the network’s local dynamics. Using the Q-values computed during the training of Deep Q-Networks, nodes can perform adaptive neighbor selection. The simulation and analytical results demonstrate that the proposed technique outperforms several widely used methods and offers strong convergence characteristics.
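
For readers unfamiliar with Deep Q-Networks, the Q-values the abstract refers to are trained against the standard Bellman target; this is the textbook DQN formulation, not an equation quoted from the paper:

```latex
% Standard DQN training objective: s is the local network state (position,
% velocity, load degree, link quality), a the chosen next-hop neighbor,
% r the reward, and theta^- the periodically frozen target-network parameters.
\[
y = r + \gamma \max_{a'} Q(s', a'; \theta^{-}),
\qquad
L(\theta) = \mathbb{E}\big[(y - Q(s, a; \theta))^2\big]
\]
```

At routing time, adaptive neighbor selection then amounts to forwarding to the neighbor with the highest Q-value in the current state.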

DOI: 10.61463/ijset.vol.6.issue2.105

A Critical Discourse Analysis of John F. Kennedy’s Inaugural Address (1961) Using Fairclough’s Model

Authors- Radwan Alkarrash

Abstract- This research applies Norman Fairclough’s Critical Discourse Analysis (CDA) to John F. Kennedy’s inaugural address, delivered on January 20, 1961. The speech is analyzed to study how language builds ideological frameworks, displays power relationships, and creates political identities. The research investigates how Kennedy employed lexical choices, rhetorical devices, pronouns, metaphors and appeals to establish American global leadership while promoting unity in the face of both domestic civil rights tensions and foreign Cold War pressures. It also examines the relationship between the speech’s linguistic elements and its historical and socio-political setting.

DOI: 10.61463/ijset.vol.6.issue2.106

XSLT and Document Transformation in Workday Integrations: Patterns for Accurate Outbound Data Transmission

Authors: Santhosh Kumar Maddineni

Abstract: Accurate and reliable data transmission is critical in Workday integrations, especially when sending information to external vendors, payroll systems, and regulatory bodies. This paper explores the use of XSLT (Extensible Stylesheet Language Transformations) in Workday integrations to enable precise and flexible outbound document formatting. It focuses on how XSLT is used within Workday Studio and Core Connectors to transform Workday’s XML-based output into external schema formats such as CSV, JSON, or vendor-specific XML layouts. Common transformation patterns are discussed, including field mapping, conditional logic, iterative processing, and dynamic value replacement. Real-world examples illustrate how XSLT enables integration teams to accommodate complex data requirements such as multi-record grouping, flattened structures, and localized formatting without altering Workday source configurations.
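
A minimal, self-contained sketch of the XML-to-CSV field-mapping pattern discussed above, using the JAXP transformation API that ships with the JDK. The worker schema and stylesheet here are illustrative stand-ins, not Workday's actual output format:

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.StringReader;
import java.io.StringWriter;

public class XsltFieldMappingSketch {
    // Hypothetical stylesheet: maps each <Worker> record to one CSV row.
    private static final String XSLT =
        "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "  <xsl:output method='text'/>"
      + "  <xsl:template match='/Workers'>"
      + "    <xsl:for-each select='Worker'>"
      + "      <xsl:value-of select='concat(ID, \",\", Name, \"&#10;\")'/>"
      + "    </xsl:for-each>"
      + "  </xsl:template>"
      + "</xsl:stylesheet>";

    public static void main(String[] args) throws Exception {
        // Hypothetical outbound XML in a Workday-like shape (not the real schema).
        String xml = "<Workers><Worker><ID>1001</ID><Name>Jane Doe</Name></Worker></Workers>";

        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSLT)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        System.out.print(out); // -> 1001,Jane Doe
    }
}
```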

DOI: https://doi.org/10.5281/zenodo.16032939

Post-Production Defect Resolution in Workday Projects: Insights from Global Implementation Support

Authors: Santhosh Kumar Maddineni

Abstract: Post-production defect resolution is a critical phase in the Workday project lifecycle, where the stability, reliability, and user confidence in the system are tested in real-world conditions. This paper presents insights from global Workday implementations, focusing on structured approaches to identifying, prioritizing, and resolving defects after go-live. It outlines common categories of post-production issues—ranging from security misconfigurations and business process errors to integration failures and reporting inaccuracies—and explains how organizations can establish a sustainable support model to manage them effectively. Key strategies include the use of ticketing systems for root cause tracking, SLAs for timely resolution, and dedicated triage teams to separate urgent fixes from enhancement requests. The paper also emphasizes the importance of knowledge transfer, configuration documentation, and regression testing to reduce recurring issues. Real-world case studies highlight lessons learned from global rollouts, particularly in coordinating cross-functional teams, managing time zone differences, and maintaining consistent change control practices.

DOI: https://doi.org/10.5281/zenodo.16033517

Multi-Format File Handling in Workday: Strategies to Manage CSV, XML, JSON, and EDI-Based Integrations

Authors: Santhosh Kumar Maddineni

Abstract: Workday, as a comprehensive cloud-based Human Capital Management (HCM) and Financials platform, supports varied business integration needs by enabling seamless communication with external systems. This paper explores multi-format file handling capabilities in Workday integrations, particularly focusing on CSV, XML, JSON, and Electronic Data Interchange (EDI) files. Through an in-depth discussion of transformation strategies, real-time and batch processing approaches, and error-handling techniques, this study outlines best practices for designing resilient and scalable file-based integrations using Workday Studio, Enterprise Interface Builder (EIB), and Core Connectors.
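
A simplified sketch of the format-dispatch step such multi-format integrations begin with. The routing-by-extension logic and the naive CSV split are illustrative stand-ins for the idea, not Workday Studio's actual mechanism:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class FormatRouterSketch {
    // Route an inbound file to a parser based on its extension.
    public static void handle(Path file) throws Exception {
        String name = file.getFileName().toString().toLowerCase();
        if (name.endsWith(".csv")) {
            for (String line : Files.readAllLines(file)) {
                String[] fields = line.split(",", -1); // naive CSV: no quoted commas
                System.out.println(Arrays.toString(fields));
            }
        } else if (name.endsWith(".xml")) {
            var doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(file.toFile());
            System.out.println("Root element: " + doc.getDocumentElement().getTagName());
        } else {
            // JSON and EDI handlers would slot in here the same way.
            throw new IllegalArgumentException("Unsupported format: " + name);
        }
    }

    public static void main(String[] args) throws Exception {
        handle(Path.of(args[0])); // e.g. java FormatRouterSketch inbound/workers.csv
    }
}
```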

DOI: https://doi.org/10.5281/zenodo.16034006

Dimensional Modeling Reimagined: Enhancing Performance And Security With Section Access In Enterprise BI Environments

Authors: Ajay Kumar Kota

Abstract: Dimensional modeling has long been the foundation of Business Intelligence (BI), providing a structured framework for organizing enterprise data into facts and dimensions that enable consistent reporting and analysis. However, the exponential growth of data volumes, increasing demands for real-time analytics, and heightened regulatory pressures have exposed the limitations of traditional dimensional models. This article reimagines dimensional modeling in the context of modern enterprise BI environments by focusing on two critical priorities: performance optimization and security enforcement. It first revisits the foundations of dimensional modeling, examining the strengths and challenges of star and snowflake schemas in large-scale deployments. The discussion then explores advanced performance optimization techniques such as fact table partitioning, indexing, surrogate keys, and pre-aggregated fact tables, illustrating how these approaches reduce query latency and improve dashboard responsiveness. The article further emphasizes the growing importance of data security, particularly in regulated industries, and highlights the role of Section Access in enabling fine-grained, role-based control over data visibility.
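
Conceptually, Section Access enforces row-level reduction: each user sees only the fact rows their entitlement allows. A generic Java sketch of that reduction idea (not Qlik's actual Section Access script syntax, and with hypothetical users and regions):

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

public class RowLevelReductionSketch {
    // Hypothetical fact row keyed by the REGION dimension.
    record SalesFact(String region, double amount) {}

    // Hypothetical entitlement table: user -> regions they may see.
    static final Map<String, Set<String>> ENTITLEMENTS =
            Map.of("alice", Set.of("EMEA"), "bob", Set.of("EMEA", "APAC"));

    // Reduce the fact table to only the rows the user is entitled to.
    static List<SalesFact> reduceFor(String user, List<SalesFact> facts) {
        Set<String> allowed = ENTITLEMENTS.getOrDefault(user, Set.of());
        return facts.stream().filter(f -> allowed.contains(f.region())).toList();
    }

    public static void main(String[] args) {
        List<SalesFact> facts = List.of(
                new SalesFact("EMEA", 120.0), new SalesFact("APAC", 80.0));
        System.out.println(reduceFor("alice", facts)); // only the EMEA row survives
    }
}
```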

DOI: https://doi.org/10.5281/zenodo.16991809

ETL On Linux: A Practical Guide To Data Transformation And Automation On RHEL And CentOS

Authors: Shreya Banerjee

Abstract: Linux-based ETL workflows are critical for enterprise data integration, analytics, and operational decision-making. This review explores ETL strategies on Red Hat Enterprise Linux and CentOS, covering extraction, transformation, and loading processes, tools, scripting techniques, and automation approaches. It examines open-source platforms, database-native methods, and workflow orchestration for scalable and maintainable pipelines. Performance optimization, logging, monitoring, and security considerations are discussed, along with practical applications in finance, healthcare, and retail. Emerging trends including cloud integration, AI-enhanced ETL, real-time processing, and containerization are highlighted to provide insights into future-ready Linux ETL pipelines. The review provides guidance for building reliable, efficient, and automated data workflows in enterprise environments.

DOI: https://doi.org/10.5281/zenodo.17278576

The Modern ETL Stack: Combining SSIS With AWS And Google Cloud For Scalable Integration

Authors: Amarjit Kaur

Abstract: The ETL landscape is evolving rapidly, driven by the need for scalable, flexible, and efficient data integration solutions. This review examines the modern ETL stack, focusing on the combination of SQL Server Integration Services (SSIS) with Amazon Web Services (AWS) and Google Cloud Platform (GCP). It explores the architecture of hybrid ETL pipelines, integration strategies, key cloud services, and best practices for automation, monitoring, and performance optimization. Case studies from finance, healthcare, and e-commerce illustrate measurable benefits, while emerging trends such as serverless architectures, AI-enhanced transformations, and multi-cloud integration highlight future directions. The insights presented aim to guide data engineers, IT professionals, and enterprises in implementing scalable, reliable, and future-ready ETL solutions that bridge legacy systems with cloud-native platforms.

DOI: https://doi.org/10.5281/zenodo.17278675

From JSON To Apex: A Guide To Handling Data From External Systems In Salesforce

Authors: Hardeep Singh

Abstract: In the modern digital ecosystem, enterprises rarely function in isolation. Data flows seamlessly between applications, systems, and platforms to ensure efficiency and enhanced customer experiences. One of the most widely used formats for data exchange is JSON (JavaScript Object Notation), favored for its lightweight structure and human readability. Within Salesforce, handling JSON data has become an essential skill to facilitate integrations with external systems, cloud services, and APIs. Apex, Salesforce’s proprietary programming language, plays a pivotal role in enabling developers to parse, manipulate, and persist JSON data. This article provides an extensive explanation of how business operations can maximize their efficiency in connecting Salesforce with outside data sources by leveraging Apex-based solutions to seamlessly consume, process, and transform JSON. It highlights the challenges faced during such integrations and their resolutions, including considerations around bulk processing, error handling, security practices, and performance optimization. Additionally, the article emphasizes best practices such as deserialization using strongly-typed Apex classes, handling dynamic JSON structures, leveraging wrapper classes, and ensuring data integrity through transactional control and validation mechanisms. By embedding JSON into Apex-based integrations, organizations foster interoperability while securely scaling communications between Salesforce and other essential systems. Given the increasing reliance on cross-application workflows in enterprise IT and customer relationship management, mastering JSON handling with Apex ensures developers and system architects can deliver robust, future-proof integration frameworks that meet today’s evolving digital demands while preparing the foundation for flexible innovation ahead.
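
The strongly-typed deserialization pattern the abstract highlights maps a JSON payload onto a typed class. Since Apex syntax closely tracks Java, here is the equivalent idea sketched in Java with the widely used Jackson library (in Apex itself this role is played by JSON.deserialize); the Account payload shape is hypothetical:

```java
// Assumes the Jackson dependency (com.fasterxml.jackson.core:jackson-databind,
// 2.12+ for record support) is on the classpath.
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonToTypedClassSketch {
    // Hypothetical payload shape received from an external system.
    record Account(String name, String industry, int employees) {}

    public static void main(String[] args) throws Exception {
        String json = "{\"name\":\"Acme\",\"industry\":\"Manufacturing\",\"employees\":250}";

        // Strongly-typed deserialization: malformed or mistyped fields fail fast
        // here instead of surfacing later as bad records.
        Account acct = new ObjectMapper().readValue(json, Account.class);
        System.out.println(acct.name() + " / " + acct.employees());
    }
}
```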

DOI: https://doi.org/10.5281/zenodo.17278743

Building A Seamless Data Pipeline: Leveraging SSIS For Enterprise-Level Data Integration And Transformation

Authors: Rehan Akhtar

Abstract: Enterprise data integration is critical for enabling timely analytics, operational efficiency, and informed decision-making. This review explores the use of SQL Server Integration Services (SSIS) for building seamless, enterprise-level ETL pipelines. It examines architectural components, pipeline design principles, integration with heterogeneous on-premises and cloud-based data sources, and strategies for automation, scheduling, and monitoring. Case studies from finance, healthcare, and retail illustrate practical applications and measurable benefits. Emerging trends such as hybrid cloud adoption, AI-enhanced transformations, real-time ETL, and serverless architectures are also discussed, highlighting the evolution of SSIS pipelines toward scalable, intelligent, and resilient data integration frameworks. The review provides actionable guidance for enterprises seeking to optimize ETL workflows and implement future-ready data pipelines.

DOI: https://doi.org/10.5281/zenodo.17278825

Data Quality Matters: Implementing Robust Scripts For Clean, Accurate, And Reliable Data

Authors: Meena Pillai

Abstract: High-quality data is essential for accurate analytics, regulatory compliance, and informed decision-making. However, modern datasets often suffer from errors, inconsistencies, and incompleteness, leading to operational inefficiencies and unreliable insights. This review examines the implementation of robust scripts for maintaining clean, accurate, and reliable data. Key aspects include understanding data quality dimensions, addressing common challenges, applying scripting techniques for profiling, cleansing, and validation, and leveraging both open-source and enterprise tools. The review also highlights best practices for script design, automation, and integration into data pipelines. Case studies across finance, healthcare, and e-commerce demonstrate measurable improvements, while emerging trends such as AI-driven quality checks, real-time validation, and alignment with data governance frameworks indicate the future direction of scalable, intelligent data quality management. The insights provided aim to guide data engineers, analysts, and organizations in establishing resilient and effective data quality practices.
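
A small sketch of the cleansing-and-validation scripting pattern described above; the field, rule, and sample data are hypothetical illustrations of the technique, not taken from the review:

```java
import java.util.List;
import java.util.regex.Pattern;

public class DataQualitySketch {
    private static final Pattern EMAIL = Pattern.compile("^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$");

    // Cleanse: trim whitespace and normalize case before validating.
    static String cleanse(String raw) {
        return raw == null ? "" : raw.trim().toLowerCase();
    }

    // Validate: a value passes only if it matches the email rule.
    static boolean isValid(String email) {
        return EMAIL.matcher(email).matches();
    }

    public static void main(String[] args) {
        List<String> raw = List.of("  Alice@Example.COM ", "not-an-email", "bob@site.org");
        long bad = raw.stream().map(DataQualitySketch::cleanse)
                      .filter(e -> !isValid(e)).count();
        // Simple profiling metric: share of rows failing validation.
        System.out.printf("invalid rows: %d of %d%n", bad, raw.size());
    }
}
```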

DOI: https://doi.org/10.5281/zenodo.17278995