Imprint 2024
June - Dec 2025
Showcase of work carried out by the participants of the first cohort, 2024
Colon and Lung Cancer Classification
Tushar Jaware, Ravindra Badgujar, Jitendra Patil, Vijay Nerkar, Jignesh Bhatt
Cancer is one of the leading worldwide health issues, with colon and lung cancers among the most prevalent and deadly forms. Timely and precise diagnosis of these cancers can greatly improve patient outcomes. This study presents a deep learning-based approach for classifying lung and colon cancer histopathology images from the LC25000 dataset. DenseNet121 was used to classify colon cancer images into one of two categories with a high accuracy of 98.5%, while GoogLeNet was used to classify lung cancer images into three categories, achieving an accuracy of 90%. The developed methodology demonstrates that CNNs can improve diagnostic precision while overcoming inter-observer variation and the time-consuming nature of traditional diagnostic methods. These results highlight the potential of AI-powered solutions for clinical assessment and for advancing cancer diagnostic capabilities.
Index Terms—Cancer, Colon, Lung, Histopathological Images, DenseNet, GoogLeNet
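A minimal sketch of the transfer-learning setup described above, using DenseNet121 as a two-class colon-tissue classifier. The image size, hyperparameters, class names, and input pipeline are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch: DenseNet121-based colon-tissue classifier (two classes), assuming LC25000
# images resized to 224x224; paths and hyperparameters are illustrative.
import tensorflow as tf
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras import layers, models

base = DenseNet121(weights="imagenet", include_top=False,
                   input_shape=(224, 224, 3), pooling="avg")
base.trainable = False  # start with frozen features; fine-tune later if needed

model = models.Sequential([
    base,
    layers.Dropout(0.3),
    layers.Dense(2, activation="softmax"),  # e.g. colon adenocarcinoma vs benign
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# train_ds / val_ds would be tf.data datasets built from the LC25000 image folders,
# e.g. via tf.keras.utils.image_dataset_from_directory (hypothetical layout).
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```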
GL-Net: Retinal Blood Vessel Segmentation using Global and Local Level Feature Learning
Nadeem Akhtar, Kiran Salunke, Amit Mahire, Bhushan Patil, Ashish Phophalia
Accurate segmentation of retinal blood vessels is crucial in the diagnosis and management of ophthalmic diseases such as diabetic retinopathy and glaucoma. The repeated downsampling in U-Net-based architectures loses important information about thin vessels, which leads to their misclassification in the segmented images. This paper introduces a deep learning approach to improve thin retinal vessel segmentation in low-contrast images. We propose GL-Net, which learns global- and local-level features of the blood vessels and improves the detection of both thick and thin vessels. The proposed method includes preprocessing of the DRIVE dataset followed by training, testing, and performance evaluation. Preprocessing consists of green channel extraction followed by CLAHE and gamma correction on the extracted patches, which are then used to train GL-Net. The proposed model is superior to state-of-the-art methods in terms of mean IoU and mean BF score. Its ability to effectively segment thin vessels can facilitate early-stage disease identification and enhance patients' quality of life.
Index Terms—Preprocessing techniques, Global and local feature learning, Retinal vessel segmentation, Retinal images, Lightweight models.
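A short sketch of the preprocessing pipeline named in the abstract (green channel, CLAHE, gamma correction, patch extraction) for a DRIVE fundus image. The CLAHE parameters, gamma value, and patch size are illustrative assumptions.

```python
# Sketch of the described preprocessing: green channel -> CLAHE -> gamma correction,
# then patch extraction for GL-Net training. Parameter values are assumptions.
import cv2
import numpy as np

def preprocess_fundus(path, gamma=1.2):
    bgr = cv2.imread(path)                      # DRIVE image on disk
    green = bgr[:, :, 1]                        # green channel gives the best vessel contrast
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(green)               # local contrast enhancement
    normalized = enhanced / 255.0
    corrected = np.power(normalized, gamma)     # gamma correction
    return (corrected * 255).astype(np.uint8)

def extract_patches(img, size=48, stride=48):
    # Non-overlapping patches fed to the network (patch size is an assumption).
    h, w = img.shape
    return [img[y:y + size, x:x + size]
            for y in range(0, h - size + 1, stride)
            for x in range(0, w - size + 1, stride)]
```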
Optimizing Solar Energy Predictions: A Comparative Study of Machine Learning Models
Joanne Gomes, Ekta Desai, Jignesh Patel
Abstract—Solar power prediction is a vital area of research, with numerous studies utilizing weather data to enhance the accuracy of energy forecasts. While high-dimensional input feature sets have been shown to improve prediction accuracy, they often pose challenges such as increased computational complexity, data requirements, and the potential introduction of noise due to irrelevant or redundant features. This study explores the potential to achieve comparable or acceptable prediction accuracy, measured in terms of R2, by reducing the input feature set to include only the most relevant features, ensuring minimal loss of critical information. Using weather datasets from five California cities, we employ advanced feature selection techniques to systematically identify and retain features that contribute most significantly to model performance. We use advanced machine learning algorithms, including Gradient Boost, nuSVR, and MLP, to evaluate the prediction performance. Our key findings reveal that for same-city projections with the full 21 features, the highest R2 value of 0.9699 is achieved using Gradient Boost, while with a reduced set of 10 features, a comparable R2 of 0.9585 is obtained using nuSVR. For cross-city projections, we leverage transfer learning to adapt models trained on one city to another, achieving the highest R2 of 0.8882 with 21 features using MLP. Notably, with a reduced set of 10 features, transfer learning improves performance further, yielding an R2 of 0.9014 using MLP. Reducing the number of features not only simplifies model computation but also lowers data collection, preprocessing, and deployment costs, making prediction models more efficient and scalable. These findings demonstrate the potential of transfer learning and feature reduction to advance robust and interpretable solar power forecasting methodologies, with significant implications for renewable energy management and smart grid optimization.
Index Terms—Solar Power Prediction, Feature Selection, Transfer Learning, Machine Learning Algorithms.
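A hedged sketch of the feature-reduction workflow described above: rank weather features, then compare prediction accuracy (R2) with the full versus reduced feature set. The CSV name, target column, and use of mutual information as the ranking criterion are assumptions for illustration.

```python
# Sketch: compare full vs top-10 feature sets for solar power prediction.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.feature_selection import mutual_info_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

df = pd.read_csv("city_weather.csv")                 # hypothetical weather dataset
X, y = df.drop(columns=["solar_power"]), df["solar_power"]

# Rank features by mutual information with the target and keep the top 10.
scores = mutual_info_regression(X, y, random_state=0)
ranked = pd.Series(scores, index=X.columns).sort_values(ascending=False)
top10 = list(ranked.index[:10])

def evaluate(features):
    X_tr, X_te, y_tr, y_te = train_test_split(X[features], y,
                                              test_size=0.2, random_state=0)
    model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
    return r2_score(y_te, model.predict(X_te))

print("R2, all features:   ", evaluate(list(X.columns)))
print("R2, top-10 features:", evaluate(top10))
```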
Crop Disease Identification from Images Using CNN
Nikhil Borse, Jayesh Chaudhari, Bhushan Patil, Anup Patil, Ravi Nahta
Abstract—This paper discusses the development of a web application designed to detect plant diseases, with a focus on cotton crops. Agriculture is crucial in India due to population growth and rising food demands, but plant diseases pose a major threat to crop production, causing economic losses and food insecurity. Early and accurate disease detection is essential for effective intervention. Traditional manual methods of detection are time-consuming, error-prone, and unreliable. The proposed web application uses a Convolutional Neural Network (CNN) to classify cotton leaves as healthy or diseased based on images. The system incorporates image pre-processing techniques for enhancement, followed by CNN-based feature extraction and classification. The application demonstrates high accuracy in identifying plant diseases, offering a reliable tool for farmers and researchers to monitor and manage crop health. The paper concludes that CNN-based plant disease detection systems have strong potential to improve disease diagnosis, provide specific information about the disease, including symptoms and recommended treatments, and ultimately contribute to better crop yields and food security.
Index Terms—Plant disease detection, CNN, Deep Learning, Image processing
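A minimal sketch of a CNN classifier for cotton leaf images (healthy versus diseased), in the spirit of the system described above. The architecture depth, image size, and directory layout are assumptions rather than the authors' exact model.

```python
# Sketch: small CNN for binary cotton-leaf classification; sizes are illustrative.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=(128, 128, 3)),
    layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # healthy (0) vs diseased (1)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Example input pipeline (hypothetical folders: data/healthy, data/diseased):
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "data", image_size=(128, 128), batch_size=32, label_mode="binary")
# model.fit(train_ds, epochs=15)
```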
Enhancing Thermal Efficiency with Fins: AI and Machine Learning Solutions
Shinde Nilesh, Kumbhar Anil, Jamadar Pradip, Abhishek Paul
Effective thermal management is essential for optimizing device performance, prolonging product life, conserving energy, protecting the environment, and preventing thermal failures, which underscores the importance of studying heat transfer enhancement mechanisms such as pin fins. Pin fins are a popular passive cooling method that can enhance heat transfer efficiency in applications including microchannel cooling, solar air heater ducts, and gas turbine cooling. Systems employing pin fins have been shown to exhibit significant heat transfer enhancement along with substantial pressure drops, underscoring the need to optimize pin-fin designs and configurations for improved heat transfer, favorable flow structure, and reduced pressure drop. This study critically evaluates the existing literature on pin fins of various shapes and configurations and investigates their thermohydraulic performance in balancing heat transmission, energy consumption, and system efficiency. To help the research community develop more effective and efficient cooling systems, the paper analyzes the effects of various pin-fin shapes on total system performance. Going beyond a basic overview, the review carefully weighs the benefits and drawbacks of different pin-fin designs and their effects on various flow types and heat transfer mechanisms. By critically analyzing current literature, identifying areas requiring further investigation, and applying AI techniques to optimize fin height and width, this study advances the development of thermal management techniques for sustainable and renewable energy systems.
Index Terms—Heat Transfer, Fins, Thermal performance
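A hedged sketch of the AI-assisted fin sizing idea: fit a surrogate model to a handful of pin-fin performance samples and search fin height and width for the best predicted performance. The sample data, performance index, model choice, and design bounds below are purely illustrative placeholders.

```python
# Sketch: surrogate-based search over fin height/width (all numbers are made up).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical samples: [fin_height_mm, fin_width_mm] -> thermal performance index
X = np.array([[5, 1], [5, 2], [10, 1], [10, 2],
              [15, 1], [15, 2], [20, 1], [20, 2]], dtype=float)
y = np.array([42.0, 47.0, 55.0, 61.0, 63.0, 66.0, 64.0, 65.0])  # illustrative responses

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Grid search over the design space using the surrogate's predictions.
heights = np.linspace(5, 20, 61)
widths = np.linspace(1, 2, 21)
grid = np.array([[h, w] for h in heights for w in widths])
best = grid[np.argmax(surrogate.predict(grid))]
print(f"Predicted best fin height/width: {best[0]:.1f} mm x {best[1]:.2f} mm")
```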
Crop Yield Prediction Using AI Models: A Study on Model Performance and Accuracy
Ramya Swetha, Sravanthi K, Sonal Saluja, Kanaka Raju Kalla, Deepika Gupta
Crop yield prediction plays a vital role in precision agriculture by enabling sustainable farming and efficient resource management. This study evaluates the performance of AI models—Random Forest, AdaBoost, XGBoost, LightGBM, CatBoost, and a Hybrid Neural Network (HNN)—using a crop dataset from 2014 to 2023. The results highlight CatBoost as the leading boosting model with an R-squared of 0.9762 and an RMSE of 3.97, while the HNN outperforms all models with an R-squared of 0.9792 and notable improvements across key metrics. Statistical validations reinforce the superior ability of the HNN to capture non-linear patterns in agricultural data, highlighting its potential to improve precision, sustainability, and decision-making in agriculture. These findings underscore the value of integrating machine learning and deep learning technologies to improve productivity and address agricultural challenges.
Index Terms—Crop yield prediction, CatBoost, Hybrid Neural Network (HNN), agricultural productivity.
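A hedged sketch of the kind of model benchmark the abstract describes, comparing two of the named regressors on a crop-yield table. The file name, target column, and train/test split are hypothetical placeholders.

```python
# Sketch: compare yield regressors by R2 and RMSE (dataset columns are assumptions).
import pandas as pd
from catboost import CatBoostRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

df = pd.read_csv("crop_yield_2014_2023.csv")         # hypothetical dataset
X, y = df.drop(columns=["yield"]), df["yield"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "RandomForest": RandomForestRegressor(random_state=42),
    "CatBoost": CatBoostRegressor(verbose=0, random_state=42),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: R2={r2_score(y_te, pred):.4f}, RMSE={rmse:.2f}")
```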
Predictive Maintenance Using AI: A Case Study of Mechanical Pumps Using LSTM-Based Time Series Classification
Sushrut Raijade, Gajanan Gambhire, Babu Reddy, Milan Motta, Gaurav Pareek
This study presents a predictive maintenance approach for mechanical pumps using Long Short-Term Memory (LSTM) networks. The dataset comprises time-series data from 54 samples with 51 sensor-derived features and a machine failure indicator. Preprocessing steps include normalization, rolling-window application, and lagging, enhancing the dataset’s temporal features. An LSTM model was trained and evaluated, achieving 96.78% accuracy, high sensitivity, and an AUC-ROC score indicative of robust classification performance. This methodology demonstrates the potential of LSTM models in predictive maintenance, enabling early fault detection and reduced downtime. It also opens the door to extending this work with time-series imaging based on Gramian Angular Field (GAF) representations combined with CNNs.
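A minimal sketch of the described pipeline: normalize the sensor features, build rolling windows, and train an LSTM failure classifier. The window length, network size, and stand-in data are assumptions; the real inputs would be the pump sensor series and failure indicator.

```python
# Sketch: rolling-window LSTM classifier for machine-failure prediction.
import numpy as np
import tensorflow as tf
from sklearn.preprocessing import MinMaxScaler

def make_windows(features, labels, window=10):
    # features: (timesteps, 51) sensor matrix; labels: (timesteps,) failure indicator
    X, y = [], []
    for i in range(len(features) - window):
        X.append(features[i:i + window])
        y.append(labels[i + window])        # label the step right after the window
    return np.array(X), np.array(y)

# Stand-in arrays; in practice these come from the pump dataset (51 sensor features).
raw_X = np.random.rand(500, 51)
raw_y = (np.random.rand(500) > 0.9).astype(int)

scaled = MinMaxScaler().fit_transform(raw_X)         # normalization
X, y = make_windows(scaled, raw_y)                   # rolling windows / lagging

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(X.shape[1], X.shape[2])),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```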
Solar Panel Placement using Artificial Intelligence
Anuj Khond, Ashutosh Pataskar, Ashish Lande, Pramit Mazumdar
Abstract—Solar energy is one of the most abundant non-conventional energy sources available on Earth. In the present work, the tilt angle for solar panel placement is predicted using machine learning algorithms for two cities, Nashik and Solapur. The tilt angle plays a crucial role in maximizing the efficiency of solar panels. Accordingly, the tilt angle is predicted using regression analysis and a genetic algorithm for Nashik and Solapur. Temperature (T) and irradiance parameters such as Direct Normal Irradiance (DNI), Diffuse Horizontal Irradiance (DHI), and Global Horizontal Irradiance (GHI) were used to predict the tilt angle. All three methods give the same results, with tilt angles of 15.5 degrees for Nashik and 13.4 degrees for Solapur. These algorithms can be further used to predict the tilt angle for any other location with an available dataset.
Index Terms—Solar Panel, Irradiance, Machine Learning, Tilt angle
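A hedged sketch of the regression-analysis route for tilt-angle prediction from T, DNI, DHI, and GHI. The CSV file, column names, and target are hypothetical stand-ins for the Nashik/Solapur datasets, and this shows only one of the approaches mentioned above.

```python
# Sketch: predict tilt angle from weather/irradiance features via linear regression.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("nashik_weather.csv")            # hypothetical dataset
X = df[["T", "DNI", "DHI", "GHI"]]
y = df["optimal_tilt_deg"]                        # target tilt angle in degrees

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
reg = LinearRegression().fit(X_tr, y_tr)
print("MAE (deg):", mean_absolute_error(y_te, reg.predict(X_te)))
print("Mean predicted tilt (deg):", reg.predict(X_te).mean())
```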
Integration of AI Techniques for Enhanced Prediction of Concrete Strength Properties
Venu Malagavelli, GVR. Seshagiri Rao, G. Sarat Raju, K. Ch Apparao, and Venkata Phanikrishna Balam
Abstract—Predicting the compressive strength of concrete is a complex task due to the heterogeneous nature of its mixture and the variability of its materials. Researchers have utilized machine learning and deep learning models to forecast the compressive strength of concrete for various mixes. This study focuses on predicting the compressive strength of fiber-reinforced concrete using boosting machine learning (BML) algorithms, including Gradient Boosting Regressor (GBR), AdaBoost Regressor, and Extreme Gradient Boosting (XGB). The performance of these BML models is assessed using metrics such as prediction accuracy (R2) and error rates, including Mean Squared Error (MSE), Root Mean Squared Error (RMSE), and Mean Absolute Error (MAE). Among the three models, the XGB model demonstrates the highest prediction accuracy, achieving an R2 of 0.89, along with the lowest error rates, with an MSE of 40.91 and an RMSE of 6.39 on the test dataset. In conclusion, the XGB model emerges as the most effective BML algorithm for predicting the compressive strength of concrete, delivering superior accuracy and minimal error.
Index Terms—Concrete, compressive strength, prediction, machine learning
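A hedged sketch of the boosting-model comparison described above for fiber-reinforced concrete strength. The mix-design columns, file name, and default hyperparameters are assumptions for illustration.

```python
# Sketch: compare GBR, AdaBoost, and XGB on a concrete-strength table.
import pandas as pd
from xgboost import XGBRegressor
from sklearn.ensemble import GradientBoostingRegressor, AdaBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

df = pd.read_csv("frc_mixes.csv")                         # hypothetical dataset
X, y = df.drop(columns=["compressive_strength"]), df["compressive_strength"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)

for name, model in {"GBR": GradientBoostingRegressor(),
                    "AdaBoost": AdaBoostRegressor(),
                    "XGB": XGBRegressor()}.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: R2={r2_score(y_te, pred):.2f}, "
          f"MSE={mean_squared_error(y_te, pred):.2f}, "
          f"MAE={mean_absolute_error(y_te, pred):.2f}")
```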
Intelligent Fuzzy Logic Control Technique for Stiction Reduction in Pneumatic Control Valve
Pradeep Kumar Rohilla, Kinnera Rambabu, G Surya Prakash Rao, Suni Chintha, Varun Kumar
Abstract—The pneumatic control valve is a highly nonlinear component of the control loop; nonlinearities such as stiction cause oscillations in the steady-state response and limit cycles. Control valve stiction causes process variable oscillations, which lowers product quality and efficiency, accelerates wear and tear, and causes instability. In this paper, stiction in a control valve is diagnosed using an intelligent controller, a fuzzy integral plus proportional derivative (Fuzzy I+PD) controller, which reduces stiction-induced oscillations in the process variable and controller output. The controller is designed in the Matlab/Simulink environment. A performance comparison between the Fuzzy I+PD and PID controllers is conducted for setpoint tracking and disturbance rejection at various operating points. The experimental results show that, compared to a traditional PID controller, the Fuzzy I+PD controller performs better, effectively compensates for stiction-based oscillations, and enhances closed-loop performance.
Index Terms—Fuzzy Logic Control, Pneumatic Control Valve, Stiction Reduction
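A plain-Python toy illustration of the fuzzy-plus-integral idea (the paper's controller is built in Matlab/Simulink, not shown here): a small rule base maps error and error rate to a control correction, which is added to an integral term and applied to a simple first-order plant. Membership ranges, rules, gains, and the plant model are all illustrative assumptions.

```python
# Sketch: fuzzy inference combined with integral action on a toy first-order process.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on points (a, b, c)."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_pd(e, de):
    # Fuzzy sets negative / zero / positive for both inputs (shoulders extended).
    sets = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}
    # Rule table: (error set, error-rate set) -> crisp output level.
    rules = {("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,
             ("Z", "N"): -0.5, ("Z", "Z"): 0.0,  ("Z", "P"): 0.5,
             ("P", "N"): 0.0,  ("P", "Z"): 0.5,  ("P", "P"): 1.0}
    num = den = 0.0
    for (se, sde), out in rules.items():
        w = min(tri(e, *sets[se]), tri(de, *sets[sde]))  # rule firing strength
        num += w * out
        den += w
    return num / den if den else 0.0        # weighted-average defuzzification

ki, kf, dt = 0.4, 1.0, 0.05                 # illustrative gains and step size
y = integ = prev_e = 0.0
for _ in range(200):
    e = 1.0 - y                             # setpoint = 1.0
    de = (e - prev_e) / dt
    integ += ki * e * dt                    # integral action on the error
    u = integ + kf * fuzzy_pd(np.clip(e, -1, 1), np.clip(de / 10, -1, 1))
    y += dt * (-y + u)                      # toy plant dy/dt = -y + u
    prev_e = e
print("final output:", round(y, 3))
```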
5G Antenna Design Using Machine Learning Algorithms
Mohit Pant, Rashmi Pant, Bhupendra Kumar
Abstract—Industry 4.0 combines emerging technologies such as artificial intelligence, IoT, cloud computing, 5G, and many more. 5G mobile communication is essential to Industry 4.0 connectivity requirements such as low latency (less than 1 ms), high data rate (20 Gbps), high reliability, high bandwidth (up to 400 MHz), and massive device connectivity. 5G also plays a significant role in the advancement of crucial sectors such as healthcare (robotic surgery), defense (faster and more secure communication), and smart cities (driverless cars). A crucial part of 5G mobile communication is the millimeter-wave (mmWave) antenna. In this paper, a 5G single-input single-output (SISO) antenna is designed for 28 GHz mmWave wireless communication. The proposed antenna structure covers the mmWave 5G NR frequency bands n257 (26.5-29.5 GHz) and n261 (27.5-28.35 GHz). The scattering parameter (S11) is predicted using machine learning (ML) models such as Gaussian process regression (GPR), random forest regression (RFR), and gradient boosting regression (GBR). The trained ML models take less time to predict the return loss of the 5G antenna at 28 GHz and have lower complexity than traditional electromagnetic simulation software. The prediction accuracy of the RFR model is marginally better than that of the other models. The reported root mean square error (RMSE) is 1.08.
Index Terms—5G, Antenna, Machine learning, Random forest.
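A hedged sketch of S11 prediction with the three ML models named in the abstract. The input features and training samples below are synthetic placeholders for data that would normally come from an electromagnetic solver sweep.

```python
# Sketch: compare GPR, RFR, and GBR for return-loss (S11) regression on synthetic data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Hypothetical design features: e.g. patch width (mm) and frequency (GHz).
X = rng.uniform([2.0, 25.0], [4.0, 30.0], size=(200, 2))
# Fake S11 response in dB, standing in for simulator output.
y = -20 + 5 * np.sin(X[:, 0]) + 0.5 * (X[:, 1] - 28) ** 2 + rng.normal(0, 0.5, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
for name, model in {"GPR": GaussianProcessRegressor(),
                    "RFR": RandomForestRegressor(random_state=0),
                    "GBR": GradientBoostingRegressor(random_state=0)}.items():
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: RMSE = {rmse:.2f} dB")
```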
Comparative Analysis of Pre-Trained Deep Learning Models for Rice Grain Classification
Mahesh Dembrani, Vinitkumar Patel, Anupkumar Jayaswal, Pratik Shah
Abstract—This study explores the effectiveness of several pretrained convolutional neural network (CNN) models for classifying rice varieties from images. The models evaluated include Vision Transformer (ViT), ResNet50, MobileNet, EfficientNetB7, and VGG16. The rice image dataset was processed, and the models’ performances were compared based on metrics such as accuracy, precision, recall, and F1-score. To enhance computational efficiency and minimize overfitting, Principal Component Analysis (PCA) was applied to reduce the dimensionality of the extracted feature vectors. Each model’s features were then classified using a Support Vector Machine (SVM) classifier. The results indicated that EfficientNetB7-SVM achieved the best performance, with superior accuracy and robustness across the evaluation metrics. Other models, including MobileNet-SVM, ResNet50-SVM, and ViT-SVM, also demonstrated competitive results, while VGG16-SVM showed comparatively lower performance. The findings suggest that EfficientNetB7-SVM provides an optimal balance between computational efficiency and classification accuracy, making it highly suitable for agricultural applications. Finally, the paper discusses future research directions, such as model ensembling, data augmentation techniques, and the deployment of real-time classification systems, to further improve rice variety classification and support precision agriculture initiatives.
Index Terms—Rice Grain, ViT, VGG16, EfficientNetB7, ResNet50, MobileNet, SVM.
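A hedged sketch of the EfficientNetB7-to-PCA-to-SVM pipeline described above: a frozen pretrained backbone extracts pooled embeddings, PCA reduces their dimensionality, and an SVM does the final classification. The image folder layout, input size, and PCA dimensionality are illustrative assumptions.

```python
# Sketch: pretrained EfficientNetB7 features -> PCA -> SVM for rice variety classification.
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras.applications.efficientnet import preprocess_input
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Frozen CNN used purely as a feature extractor (global-average-pooled embeddings).
extractor = EfficientNetB7(weights="imagenet", include_top=False,
                           input_shape=(224, 224, 3), pooling="avg")

ds = tf.keras.utils.image_dataset_from_directory(
    "rice_images", image_size=(224, 224), batch_size=32, shuffle=False)  # hypothetical folder
features, labels = [], []
for images, y in ds:
    features.append(extractor(preprocess_input(images), training=False).numpy())
    labels.append(y.numpy())
X, y = np.vstack(features), np.concatenate(labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
clf = make_pipeline(PCA(n_components=100), SVC(kernel="rbf"))   # PCA dims are an assumption
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```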
Design Optimization of Helical Spring Lock Washer
Hemant K. Wagh, Pandit S. Patil, Rohan R. Ozarkar, Pratik Shah
Paper Poster Presentation