Analysis and Prediction of Crime Detection Techniques Using Machine Learning Approach
Authors – Sameeksha Bhati, Assistant Professor Priyanshu Dhameniya
Abstract – Machine learning is the field of study that examines how machines can learn to act autonomously. Self-driving cars, speech recognition, web search, and a deeper understanding of the human genome are just a few of its recent applications. It has also made it possible to forecast crime from historical data. Classification is a supervised prediction method that assigns nominal class labels. Weather forecasting, medical care, banking, homeland security, and corporate intelligence are just a few of the numerous fields that have benefited from classification [6]. Data gathering, classification, pattern recognition, prediction, and visualization are the typical steps in a machine learning-based approach to analyzing criminal behavior. Classic data mining methods such as association analysis, classification and prediction, cluster analysis, and outlier analysis focus on structured data; newer methods can also extract useful insights from unstructured data.
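As an illustration of the supervised classification step this abstract describes, the following minimal Python sketch (not from the paper; the features, the label encoding, and the random-forest choice are assumptions for demonstration) trains a classifier on synthetic historical crime records:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Synthetic stand-in for historical crime records: hour of day,
    # day of week, and a numeric district code (all assumed features).
    rng = np.random.default_rng(0)
    X = np.column_stack([
        rng.integers(0, 24, 1000),   # hour of day
        rng.integers(0, 7, 1000),    # day of week
        rng.integers(0, 10, 1000),   # district code
    ])
    # Nominal class label, e.g. 0 = theft, 1 = assault (assumed encoding).
    y = (X[:, 0] > 18).astype(int) ^ (rng.random(1000) < 0.1)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))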
Analysis Of Symptoms And Severe Outcomes Of COVID-19
Authors – Kavita Sheoran, Geetika Dhand, Shaily Malik, Nishtha Jatana
Abstract – An outbreak of pneumonia caused by Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2), the virus responsible for COVID-19, began in Wuhan, China, in December 2019 and has since become a global pandemic. The disease has many similarities with those caused by Severe Acute Respiratory Syndrome Coronavirus (SARS-CoV) and Middle East Respiratory Syndrome Coronavirus (MERS-CoV). In this paper, severe complications observed in other organs are analyzed, and the immune response, various symptoms, general advisories, and recovery status of patients in India are also compared.
The Role of Imaging Modalities in Lung Cancer Treatment
Authors – Research Scholar V.Juliet Rani, Asst. Prof. Dr. K.K.Thanammal
Abstract – Lung cancer is among the deadliest diseases in the world, and it continues to spread globally. This paper overviews one of the most important and challenging problems in oncology: lung cancer diagnosis using computer information systems. Developing an effective computer-aided diagnosis (CAD) system for lung cancer is of great clinical importance, since an accurate CAD system can support reliable diagnosis, and such systems have been explored in a large number of research studies. A typical CAD system for lung cancer diagnosis is composed of four main processing steps: segmentation of the lung fields, detection of nodules inside the lung fields, segmentation of the detected nodules, and diagnosis of the nodules as benign or malignant. Accomplishing these four steps requires appropriate imaging modalities.
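To make the four-step structure concrete, the Python skeleton below mirrors it on a synthetic 2-D slice. It is a deliberately simplified sketch, not a clinical implementation: the Hounsfield-unit thresholds and the size-based benign/malignant rule are illustrative assumptions.

    import numpy as np
    from scipy import ndimage

    def segment_lung_fields(ct_slice, air_threshold=-400):
        # Step 1: crude lung-field mask via HU thresholding;
        # hole-filling keeps dense nodules inside the mask.
        return ndimage.binary_fill_holes(ct_slice < air_threshold)

    def detect_nodules(ct_slice, lung_mask, nodule_threshold=-100):
        # Step 2: candidate nodules = dense tissue inside the lung fields.
        return (ct_slice > nodule_threshold) & lung_mask

    def segment_nodules(candidate_mask):
        # Step 3: label connected components as individual nodules.
        return ndimage.label(candidate_mask)

    def diagnose(labels, n, size_cutoff=50):
        # Step 4: toy rule -- large nodules flagged malignant (illustrative only).
        return ["malignant" if (labels == i).sum() > size_cutoff else "benign"
                for i in range(1, n + 1)]

    # Synthetic slice: tissue (0 HU), lungs (-800 HU), one nodule (40 HU).
    ct = np.zeros((128, 128))
    ct[20:108, 20:108] = -800
    ct[60:68, 60:68] = 40
    lungs = segment_lung_fields(ct)
    labels, n = segment_nodules(detect_nodules(ct, lungs))
    print(diagnose(labels, n))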
Design and Thermal Analysis of Double Pipe Heat Exchanger by Changing Mass Flow Rate
Authors – M.Tech. Scholar Naveen Kumar, Prof. Abhishek Bhandari
Abstract – Heat exchangers are employed in a variety of applications, including power plants, nuclear reactors, RAC (refrigeration and air-conditioning) systems, automotive industries, food industries, heat-recovery systems, and chemical processing. Heat-transfer enhancement techniques can be divided into two categories: active and passive. Active approaches require external forces, whereas passive approaches rely on discrete surface geometries; both are commonly utilized to increase heat exchanger performance. Helical tubes are counted among the passive enhancement devices and, owing to their compact construction and high heat transfer coefficients, are widely employed in industrial applications. In this work, the thermo-hydraulic performance of various configurations of gas-to-liquid double-pipe heat exchangers featuring helical fins was investigated using a CFD-based computational model. The heat transfer, pressure drop, unit weight, and overall performance of helical and longitudinal fin configurations were studied through numerical simulation, and the effects of increasing the number of fins and the Reynolds number on thermo-hydraulic performance were also investigated.
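For orientation on how mass flow rate drives the quantities varied in such studies, the sketch below uses textbook relations rather than the paper's CFD model; the fluid properties, pipe diameter, and the use of the standard Dittus-Boelter smooth-tube correlation are all illustrative assumptions.

    import math

    # Assumed water properties near 300 K and an assumed inner-pipe diameter.
    rho, mu, k, Pr = 996.0, 8.9e-4, 0.61, 6.1   # kg/m^3, Pa.s, W/m.K, -
    D = 0.025                                   # m

    for m_dot in (0.05, 0.10, 0.20):            # kg/s, varied mass flow rate
        area = math.pi * D**2 / 4
        v = m_dot / (rho * area)                # mean velocity, m/s
        Re = rho * v * D / mu                   # Reynolds number
        # Dittus-Boelter (heating); strictly valid for turbulent flow, Re > ~1e4.
        Nu = 0.023 * Re**0.8 * Pr**0.4
        h = Nu * k / D                          # convective coefficient, W/m^2.K
        print(f"m_dot={m_dot:.2f} kg/s  Re={Re:.0f}  h={h:.0f} W/m^2K")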
Total Harmonic Distortion Performance Analysis between Micro-Inverter and Single Phase Inverter Photovoltaic Systems
Authors – Sonali Mathur, Assistant Professor Mukesh Kumar
Abstract – Each existing solar micro-inverter topology converts the direct current (DC) input voltage to alternating current (AC) in several stages, and each stage may contain one or more power converters along with a transformer, a filter, and a diode rectifier, resulting in a very large number of active and passive components. In the scope of this thesis, a new architecture for a solar micro-inverter is developed. The new micro-inverter is built around a novel single-switch inverter obtained by modifying the existing single-ended primary-inductance (SEPIC) DC-DC converter, and it converts DC power into a clean sinusoidal waveform. The construction and operation of the new inverter are analyzed in detail; with the help of a controller it can produce almost any output waveform, and it was found to operate in four distinct modes. The inverter was modeled using state-space averaging. Because of the switching inherent in the circuit, the system is a non-linear fourth-order system and must be linearized around an operating point before it can be analyzed as a linear system. The inverter's control-to-output transfer function is found to be non-minimum phase, and the transfer functions are studied using the root-locus method; from a control standpoint, the presence of a right-half-plane zero complicates the design of the controller structure. The cell equations are used to build a model of the photovoltaic (PV) cell in MATLAB, and the maximum power point tracking (MPPT) method keeps the PV cell's output power at its highest level so that the available power is used to its fullest potential; the simplest way to do this is the perturb-and-observe approach, which perturbs the operating point and observes the resulting change in power. The new inverter eliminates the separate stages that a traditional solar micro-inverter requires. The proposed design was confirmed by both simulation and experiments on the laboratory set-up.
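The perturb-and-observe idea can be stated compactly in code. The Python sketch below is not the MATLAB model from the thesis; it uses a simplified single-diode PV model with assumed parameters, perturbs the operating voltage, and keeps whichever direction increases power:

    import math

    # Assumed simplified single-diode PV model: I = Isc - I0*(exp(V/nVt) - 1).
    Isc, I0, nVt = 8.0, 6e-11, 1.56

    def pv_power(v):
        return v * max(Isc - I0 * (math.exp(v / nVt) - 1.0), 0.0)

    # Perturb and observe: step the voltage; keep the direction if power rose,
    # reverse it if power fell. The operating point settles near the MPP.
    v, step, p_prev = 20.0, 0.5, 0.0
    for _ in range(100):
        p = pv_power(v)
        if p < p_prev:        # power fell: reverse the perturbation direction
            step = -step
        p_prev = p
        v += step

    print(f"MPP near V={v:.1f} V, P={pv_power(v):.1f} W")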
Remote Substation Monitoring in a Distribution Power Grid
Authors – M. Tech. Scholar Jasti Raja, Prof. Katragadda Swarnasri, Asst. Prof. Ponnam Venkata K Babu
Abstract – Monitoring and control of distribution transformers is desired by any distribution utility for many reasons. Distribution transformers are located in remote places in order to supply power to different categories of consumers, so to provide good and reliable power, automation has become an essential part of the distribution network. Monitoring and control essentially require acquiring data from the grid, analyzing it, and controlling the devices on the network on the basis of the evaluated results. The objective of this paper is to design a cost-effective model for monitoring remote electrical parameters of a transformer, such as voltage, current, and temperature, and to send these real-time values over a network to a remotely located substation or device. The system automatically reports the real-time electrical parameters periodically (based on time settings) and can be configured to send alerts whenever the relay trips or the voltage or current exceeds predefined limits. The experimental setup is a prototype of the proposed system; for demonstration purposes, an Arduino and a Raspberry Pi were used. The controller communicates efficiently with the various sensors and detects abnormal operating conditions.
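A minimal version of the alerting logic can be sketched in Python (runnable on a Raspberry Pi or any PC). The sensor readings are simulated here, and the limits and reporting interval are assumptions; a real deployment would read an ADC or sensor bus and transmit alerts over the network rather than printing them.

    import random, time

    # Assumed safe operating limits for the monitored transformer.
    LIMITS = {"voltage_V": (210, 240), "current_A": (0, 50), "temp_C": (0, 75)}

    def read_sensors():
        # Simulated readings; a real deployment would query the hardware.
        return {"voltage_V": random.uniform(200, 250),
                "current_A": random.uniform(0, 60),
                "temp_C": random.uniform(25, 90)}

    for _ in range(5):                     # periodic update (time-setting based)
        sample = read_sensors()
        for name, value in sample.items():
            lo, hi = LIMITS[name]
            if not lo <= value <= hi:
                print(f"ALERT: {name}={value:.1f} outside [{lo}, {hi}]")
        time.sleep(1)                      # reporting interval (assumed 1 s)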
A Survey on Image Watermarking Techniques and Attacks
Authors – Dilesh Khairwar, Asst. Prof. Sumit Sharma
Abstract – Digital information can be transferred from one location to another with less difficulty than any other medium, and text, audio, video, and image data can all be transferred over the same media using the same methods. To protect such data, owners take precautions such as embedding a signature or validation information that is checked at the receiver end; the security of these data depends heavily on the protocols used. This article presents a comprehensive review of the approaches to protecting digital image data that have been offered by researchers. Signature-embedding techniques and their attributes are broken down in detail to improve the reader's grasp of the field, and the various network attacks that may affect the received data are also elaborated. The study further explains the image features that researchers use to secure digital data, since each feature has its own significance and area of use that varies with the type of image and the attacks being mounted.
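As one concrete instance of the signature-embedding techniques surveyed, the sketch below shows the classic least-significant-bit (LSB) method on a synthetic grayscale image; it is illustrative only, since the survey covers far more attack-resistant schemes, and the image and bit string are assumptions.

    import numpy as np

    def embed_lsb(image, bits):
        # Write the signature bits into the least-significant bit plane.
        flat = image.flatten()
        flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
        return flat.reshape(image.shape)

    def extract_lsb(image, n_bits):
        # Recover the signature from the LSB plane at the receiver end.
        return image.flatten()[:n_bits] & 1

    host = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
    signature = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
    marked = embed_lsb(host, signature)
    print("recovered:", extract_lsb(marked, len(signature)))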
AI-Augmented User Access Analytics In Centrify-Managed Environments
Authors: Pooja Sharma, Ankit Mehra, Shalini Nair, Rohit Chauhan
Abstract: As enterprises grow increasingly reliant on identity-centric security models, managing and auditing privileged access has become paramount, especially in regulated environments such as healthcare, government, and finance. Centrify, a leading privileged access management (PAM) platform, offers comprehensive vaulting, session control, and policy enforcement. However, static access control methods alone often fail to detect nuanced insider threats, credential misuse, or abnormal behavioral patterns. This review explores the integration of artificial intelligence into Centrify-managed UNIX and hybrid environments to enhance user access analytics and proactively detect risks. We examine how machine learning techniques, ranging from supervised classification to anomaly detection and time-series modeling, can be used to analyze session metadata, command histories, vault activity, and authentication behavior. The paper outlines the architecture of AI-enhanced pipelines, data collection strategies, real-time alerting systems, and integration points with Centrify's policy engine. We also evaluate the implications of AI-based adaptive access controls, context-aware role adaptation, and forensic replay for audit and compliance. Through detailed sections on threat modeling, deployment strategies, and federated learning approaches, this review positions AI as a transformative layer over traditional access control. Ultimately, AI-augmented user access analytics enables more intelligent, responsive, and resilient identity governance, essential for maintaining Zero Trust postures and meeting modern regulatory requirements.
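To make the anomaly-detection idea concrete, the following Python sketch flags an outlying session from synthetic metadata. It is not Centrify's pipeline: the three session features and the isolation-forest choice are assumptions for illustration.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(1)
    # Assumed per-session features: login hour, session minutes, commands issued.
    normal = np.column_stack([rng.normal(10, 2, 500),
                              rng.normal(30, 10, 500),
                              rng.normal(40, 15, 500)])
    # A 3 a.m. session with an unusually long duration and command burst.
    suspect = np.array([[3.0, 240.0, 900.0]])

    model = IsolationForest(contamination=0.01, random_state=1).fit(normal)
    print("suspect verdict:", model.predict(suspect))   # -1 marks an anomaly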
Implementing Samba Clustering Techniques To Achieve High Availability And Fault-Tolerant File Sharing In Enterprise Network Environments
Authors: Aruni Kashyap
Abstract: As enterprises increasingly demand continuous data availability and resilience in their file-sharing infrastructures, clustering Samba has emerged as a critical strategy for achieving fault tolerance and high availability. Samba, a powerful open-source software suite, provides seamless file and print services across various operating systems, notably integrating Linux/Unix servers into Windows-based environments. However, single-node Samba configurations pose significant risks of service disruption due to hardware or software failures. Clustering Samba mitigates these risks by deploying multiple redundant nodes that ensure uninterrupted access to shared resources. This article explores the conceptual and technical underpinnings of clustered Samba configurations, examining how they bolster file-sharing reliability, maintain service continuity, and simplify management within enterprise ecosystems. We discuss key architectural designs such as active-active and active-passive clustering, delve into the technologies enabling Samba clustering—including CTDB (Clustered Trivial Database), Pacemaker, and Corosync—and analyze their roles in sustaining high availability. Additionally, the article investigates best practices, real-world deployment models, performance considerations, and security implications. With digital infrastructure demands evolving rapidly, the clustering of Samba for fault-tolerant file sharing represents a critical enabler of IT service continuity. By synthesizing architectural guidance with practical implementation strategies, this article offers a comprehensive blueprint for IT architects and system administrators aiming to optimize Samba for resilience and uptime in both on-premises and hybrid cloud environments.
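For orientation, a minimal clustered-Samba configuration couples CTDB with Samba roughly as sketched below. The addresses, interface name, and shared-filesystem path are site-specific assumptions, and a production cluster needs more than these fragments:

    # /etc/ctdb/nodes -- private addresses of all cluster nodes, one per line
    10.0.0.1
    10.0.0.2

    # /etc/ctdb/public_addresses -- floating client-facing IPs CTDB assigns
    192.168.1.100/24 eth0

    # /etc/ctdb/ctdb.conf -- recovery lock must live on the shared filesystem
    [cluster]
        recovery lock = /shared/ctdb/.reclock

    # smb.conf [global] addition on every node
    clustering = yes

With this in place, CTDB coordinates the nodes' TDB databases and fails public addresses over between nodes, which is what lets clients keep reaching shares when a node drops out.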
Enhancing Samba Performance For High-Bandwidth Media Streaming Platforms Through Efficient Configuration And Network Resource Management
Authors: Jerry Pinto
Abstract: Samba, the open-source implementation of the SMB/CIFS protocol, has become a vital component in enabling file sharing across heterogeneous systems. In media streaming platforms where high throughput, low latency, and efficient concurrency are paramount, Samba's optimization directly influences performance, user satisfaction, and system scalability. This article explores the technical intricacies and performance tuning strategies for Samba in the context of media streaming, including caching mechanisms, transport-layer considerations, and filesystem interactions. Media streaming demands sustained data transfer rates for large media files, making Samba's configuration and tuning critically important for ensuring uninterrupted playback and robust access control. By drawing on real-world implementations and performance benchmarks, this article identifies bottlenecks in default Samba deployments and presents engineering solutions to enhance stream-read efficiency, reduce CPU utilization, and optimize memory handling. Moreover, it investigates the synergy between Samba and network file systems (NFS), SSD storage, and modern Linux kernel features like asynchronous I/O and systemd enhancements. As video consumption and digital content delivery grow exponentially, refining Samba for media workloads becomes essential. This paper serves as a practical guide for system architects, DevOps teams, and media IT infrastructure planners aiming to align Samba services with the stringent demands of modern streaming platforms.
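By way of illustration, this kind of tuning typically lands in smb.conf's [global] section. The values below are assumptions to be adapted per workload and benchmarked, not universal recommendations:

    [global]
        use sendfile = yes          # zero-copy reads for large media files
        aio read size = 1           # hand reads of >= 1 byte to async I/O
        aio write size = 1          # likewise for writes
        socket options = TCP_NODELAY SO_RCVBUF=524288 SO_SNDBUF=524288
        deadtime = 30               # reap idle connections after 30 minutes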
Implementing Blockchain Technology For Secure, Transparent, And Decentralized Access Control In Modern File System Architectures
Authors: Namita Gokhale
Abstract: In the digital era, data breaches and unauthorized access to sensitive information have become critical concerns, prompting the need for robust access control mechanisms. Traditional access control models such as Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) have served as the foundation of enterprise security strategies. However, these models often exhibit vulnerabilities tied to centralization, including single points of failure, susceptibility to insider threats, and limited audit capabilities. Blockchain technology, known for its decentralized architecture and immutable ledger, introduces a paradigm shift in how access control can be enforced across distributed systems. This paper delves into the concept of blockchain-based access control systems tailored for file systems, exploring the foundational technologies, implementation strategies, security implications, and future outlook. Through the use of smart contracts, access decisions can be enforced automatically, eliminating the need for human intervention and enhancing policy adherence. Furthermore, blockchain’s transparency enables comprehensive and tamper-proof auditing of access logs, ensuring accountability across all levels of the organization. Key considerations such as scalability, identity management, policy customization, and integration with traditional infrastructure are thoroughly discussed. Challenges, including transaction throughput limitations, storage constraints, and regulatory compliance, are addressed with emerging solutions such as Layer 2 protocols and privacy-preserving technologies. Real-world implementations and use cases further underscore the practical viability of the approach. In summary, blockchain-based access control for file systems offers a future-ready solution that aligns with the security, transparency, and auditability demands of modern enterprises.
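The tamper-evidence property rests on hash chaining. This Python sketch is a toy ledger, not a production blockchain (no consensus or smart-contract layer is modeled); it shows how altering one access-log entry invalidates the chain:

    import hashlib, json

    def block_hash(entry, prev_hash):
        payload = json.dumps({"entry": entry, "prev": prev_hash}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    # Append-only access log: each block commits to its predecessor.
    chain, prev = [], "0" * 64
    for entry in ["alice READ /finance/q3.xlsx", "bob WRITE /hr/payroll.db"]:
        h = block_hash(entry, prev)
        chain.append({"entry": entry, "prev": prev, "hash": h})
        prev = h

    chain[0]["entry"] = "alice READ /public/readme.txt"   # attempted tampering
    ok = all(b["hash"] == block_hash(b["entry"], b["prev"]) for b in chain)
    print("ledger intact:", ok)    # False: the tamper is detected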
Implementing Micro-Segmentation Strategies To Strengthen Security And Isolate Cloud Workloads In Virtualized And Multi-Tenant Environments
Authors: Nayantara Sahgal
Abstract: Micro-segmentation has emerged as a pivotal strategy in securing cloud workloads in modern enterprise environments. As cloud adoption accelerates, traditional perimeter-based security models are proving inadequate against increasingly sophisticated threats that target lateral movement within data centers. Micro-segmentation enables fine-grained policies that isolate workloads and control traffic based on identity, context, and application-level logic. This minimizes the attack surface and significantly reduces the risk of breaches propagating across systems. By using software-defined networking (SDN) and policy-driven automation, organizations can dynamically segment workloads without physical network changes, thus ensuring operational efficiency. This paper explores the conceptual framework of micro-segmentation, its technical implementation in multi-cloud and hybrid environments, and its synergy with identity and access management (IAM), zero trust principles, and DevSecOps practices. We also discuss challenges such as policy sprawl, visibility constraints, and compliance mapping, while presenting use cases that illustrate real-world benefits. The increasing complexity and dynamism of cloud-native applications make micro-segmentation not just an enhancement, but a necessity in cloud workload security strategies.
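In miniature, a micro-segmentation policy is an identity-keyed allowlist evaluated per flow. The Python sketch below captures the default-deny, explicit-allow model; the workload tags, ports, and rules are illustrative assumptions rather than any particular SDN product's policy format.

    # Each rule allows traffic from one workload identity to another on a port.
    ALLOW_RULES = [
        {"src": "web-frontend", "dst": "api-service", "port": 443},
        {"src": "api-service",  "dst": "orders-db",   "port": 5432},
    ]

    def is_allowed(src_tag, dst_tag, port):
        # Default deny: only explicitly allowed identity/port tuples pass,
        # which blocks lateral movement between unrelated workloads.
        return any(r["src"] == src_tag and r["dst"] == dst_tag
                   and r["port"] == port for r in ALLOW_RULES)

    print(is_allowed("web-frontend", "api-service", 443))  # True
    print(is_allowed("web-frontend", "orders-db", 5432))   # False: lateral move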
Designing And Deploying Energy-Efficient Server Infrastructure To Optimize Power Consumption In Densely Populated Urban Environments
Authors: Neel Mukherjee
Abstract: With the rapid expansion of digital infrastructure and cloud computing services, urban centers are experiencing an unprecedented demand for data processing and storage capabilities. This escalating requirement places immense stress on energy resources, especially when server deployments are not optimized for efficiency. The need for sustainable energy practices in data centers and server farms becomes paramount to mitigate environmental degradation, reduce operational costs, and support resilient urban ecosystems. This article explores the multifaceted approach required to achieve energy-efficient server deployment in densely populated areas. It examines the intersection of technology, urban planning, regulatory policies, and innovative cooling and power management techniques. Emphasis is placed on next-generation server hardware, modular deployment strategies, edge computing architectures, and renewable energy integration. Additionally, the study addresses socio-economic and environmental impacts, proposing a comprehensive roadmap to sustainability. By highlighting best practices and real-world case studies, the article aims to contribute to a paradigm shift in how cities manage their digital infrastructure. Ultimately, energy-efficient server deployment is not only a technological imperative but also a critical step toward achieving smarter, greener, and more livable urban environments.
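As a quantitative anchor, efficiency work of this kind is commonly tracked with power usage effectiveness, PUE = total facility power / IT equipment power. The arithmetic below uses figures assumed purely for illustration to show what a cooling upgrade is worth:

    it_load_kw = 500.0                 # assumed IT equipment draw
    overhead_before_kw = 400.0         # cooling + distribution, before upgrade
    overhead_after_kw = 200.0          # after upgrade (assumed)

    pue_before = (it_load_kw + overhead_before_kw) / it_load_kw   # 1.8
    pue_after = (it_load_kw + overhead_after_kw) / it_load_kw     # 1.4
    saved_kwh_year = (overhead_before_kw - overhead_after_kw) * 24 * 365
    print(f"PUE {pue_before:.2f} -> {pue_after:.2f}, "
          f"saving {saved_kwh_year:,.0f} kWh/yr")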