Passionate about using AI and machine learning to solve complex problems, I build intelligent, scalable systems by analyzing data and applying deep learning, computer vision, NLP, and predictive analytics.
NMIMS University
August 2020 - May 2024
CGPA: 3.42/4.0
Coursera - Google Certification
Jan 2026
HackerRank
Nov 2025
McKinsey & Company
Dec 2025
Designed and implemented a parameter-efficient fine-tuning (PEFT) pipeline for the open-source LLaMA-3.2-1B Instruct model using an Alpaca-format instruction dataset, enabling effective instruction-following adaptation while minimizing GPU memory usage and training cost.
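Below is a minimal sketch of a LoRA-based PEFT setup of the kind described above, assuming the Hugging Face transformers, peft, and datasets libraries; the model id, adapter ranks, dataset slice, and training arguments are illustrative assumptions, not the project's exact configuration.

```python
# Hedged sketch: LoRA-based parameter-efficient fine-tuning of LLaMA-3.2-1B Instruct
# on Alpaca-format instruction data. Hyperparameters here are illustrative only.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

model_name = "meta-llama/Llama-3.2-1B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Attach low-rank adapters so only a small fraction of weights are trainable.
lora_cfg = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                      target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()

# Alpaca-format records expose instruction / input / output fields.
dataset = load_dataset("tatsu-lab/alpaca", split="train[:1%]")

def tokenize(example):
    prompt = (f"### Instruction:\n{example['instruction']}\n\n"
              f"### Input:\n{example['input']}\n\n"
              f"### Response:\n{example['output']}")
    return tokenizer(prompt, truncation=True, max_length=512)

tokenized = dataset.map(tokenize, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama32-peft", num_train_epochs=1,
                           per_device_train_batch_size=2,
                           gradient_accumulation_steps=8,
                           learning_rate=2e-4, logging_steps=20),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```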
Addressed the challenge of automated kidney tumour detection from medical images by building an end-to-end CNN-based classification system to support accurate and timely diagnosis, achieving 88.26% validation accuracy under hardware constraints.
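A minimal sketch of a CNN classifier of this kind, assuming a PyTorch setup with 224×224 RGB inputs and two classes; layer widths and input size are illustrative, not the project's exact architecture.

```python
# Hedged sketch: small CNN for kidney tumour vs. normal image classification.
# Channel counts and the 224x224 input size are assumptions for illustration.
import torch
from torch import nn

class KidneyTumorCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 28 * 28, 128), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = KidneyTumorCNN()
logits = model(torch.randn(4, 3, 224, 224))  # batch of 4 RGB 224x224 images
print(logits.shape)                          # torch.Size([4, 2])
```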
Developed a machine learning-based credit default prediction system to help financial institutions assess borrower risk. After data cleaning, preprocessing, exploratory data analysis, and hyperparameter tuning, the XGBoost classifier achieved the best results, reaching 93.84% accuracy and an AUC of 0.86.
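A minimal sketch of a tuned XGBoost classification workflow like the one described above, assuming scikit-learn and xgboost; the file name, target column, and search grid are placeholders rather than the project's actual values.

```python
# Hedged sketch: hyperparameter-tuned XGBoost credit-default classifier.
# "credit.csv" and the "default" target column are placeholder names.
import pandas as pd
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.metrics import accuracy_score, roc_auc_score
from xgboost import XGBClassifier

df = pd.read_csv("credit.csv")  # already cleaned and preprocessed
X, y = df.drop(columns=["default"]), df["default"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Small illustrative grid; the real search space would be project-specific.
param_grid = {"n_estimators": [200, 400],
              "max_depth": [4, 6],
              "learning_rate": [0.05, 0.1]}
search = GridSearchCV(XGBClassifier(eval_metric="logloss"),
                      param_grid, scoring="roc_auc", cv=5)
search.fit(X_train, y_train)

best = search.best_estimator_
pred = best.predict(X_test)
proba = best.predict_proba(X_test)[:, 1]
print("accuracy:", accuracy_score(y_test, pred))
print("AUC:", roc_auc_score(y_test, proba))
```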
Developed an end-to-end multilingual document translation system using the llama-4-maverick-17b-instruct model via the Groq API and a Streamlit UI, enabling bidirectional translation between Indian regional languages for both PDF and DOCX files.
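A minimal sketch of the translation call at the core of such a system, assuming the official groq Python client; the exact Groq model id is an assumption, and PDF/DOCX text extraction plus the Streamlit UI are omitted for brevity.

```python
# Hedged sketch: translating extracted document text through the Groq API.
# The model id below is an assumed Groq identifier; the UI and file parsing
# (PDF/DOCX) layers of the project are not shown here.
import os
from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])

def translate(text: str, source_lang: str, target_lang: str) -> str:
    """Ask the LLM to translate one chunk of document text."""
    response = client.chat.completions.create(
        model="meta-llama/llama-4-maverick-17b-128e-instruct",  # assumed model id
        messages=[
            {"role": "system",
             "content": f"Translate the user's text from {source_lang} to "
                        f"{target_lang}. Return only the translation."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content

print(translate("नमस्ते, आप कैसे हैं?", "Hindi", "Tamil"))
```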
Recreated the TinyVGG deep learning architecture using PyTorch and torchvision to perform image classification on the FashionMNIST dataset. Designed and implemented a custom Convolutional Neural Network (CNN) consisting of Conv2d, ReLU, and MaxPool2d layers to extract hierarchical visual features.
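A minimal sketch of a TinyVGG-style model for this task, assuming 28×28 grayscale FashionMNIST inputs and 10 classes; the hidden channel count is an illustrative choice.

```python
# Hedged sketch: TinyVGG-style CNN for FashionMNIST (28x28 grayscale, 10 classes).
# hidden_units=10 is an illustrative value, not necessarily the project's choice.
import torch
from torch import nn

class TinyVGG(nn.Module):
    def __init__(self, in_channels=1, hidden_units=10, num_classes=10):
        super().__init__()
        self.block_1 = nn.Sequential(
            nn.Conv2d(in_channels, hidden_units, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden_units, hidden_units, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),   # 28x28 -> 14x14
        )
        self.block_2 = nn.Sequential(
            nn.Conv2d(hidden_units, hidden_units, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden_units, hidden_units, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),   # 14x14 -> 7x7
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(hidden_units * 7 * 7, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.block_2(self.block_1(x)))

model = TinyVGG()
print(model(torch.randn(32, 1, 28, 28)).shape)  # torch.Size([32, 10])
```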
Performed extensive data cleaning and preprocessing, including handling missing values, applying Min-Max scaling, and One-Hot Encoding categorical variables, along with feature engineering of time-based and geospatial features (using the Haversine formula for distance). After hyperparameter tuning, the XGBoost Regressor outperformed all other models, achieving an RMSE of 3.25 on Kaggle and placing in the top 30% of the leaderboard.
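A minimal sketch of the Haversine-distance and time-based feature engineering feeding an XGBoost regressor, assuming pandas, numpy, and xgboost; the file name, column names, and hyperparameters are placeholders, not the project's actual values.

```python
# Hedged sketch: Haversine distance + time features feeding an XGBoost regressor.
# Column names (pickup_latitude, target, ...) and hyperparameters are placeholders.
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = np.sin(dlat / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2) ** 2
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

df = pd.read_csv("train.csv", parse_dates=["pickup_datetime"])  # placeholder file
df["distance_km"] = haversine_km(df["pickup_latitude"], df["pickup_longitude"],
                                 df["dropoff_latitude"], df["dropoff_longitude"])
df["hour"] = df["pickup_datetime"].dt.hour          # time-based features
df["weekday"] = df["pickup_datetime"].dt.dayofweek

features = ["distance_km", "hour", "weekday"]
model = XGBRegressor(n_estimators=500, max_depth=7, learning_rate=0.05)
model.fit(df[features], df["target"])               # "target" is a placeholder column
```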
NN & Sons
September 2024 - January 2025
Ernst & Young
July 2024 - September 2025
Ernst & Young
Jan 2024 - July 2024
I'm always open to discussing new projects, creative ideas, or opportunities to be part of your vision.
Dadra & Nagar Haveli, India