Here are some of the projects I've worked on. Most are written either to solve my own problems or just for the fun of building things. Check out my GitHub for more.

A full-stack, LLM-powered "digital twin" that learns an individual's writing style from their email history, then generates emails in their voice. The architecture is a modular feature–training–inference pipeline with a React frontend, a TypeScript backend, and vector-based RAG retrieval, designed to support future fine-tuning of personalized language models.
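The retrieval step in a RAG pipeline like this boils down to ranking stored embeddings by similarity to a query embedding. A minimal sketch in plain Python, with made-up emails and toy 3-dimensional vectors standing in for real embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, corpus, k=2):
    """Return the k corpus items whose embeddings are most similar to the query."""
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item["vec"]), reverse=True)
    return ranked[:k]

# Illustrative data only -- a real pipeline would embed emails with a model
# and store the vectors in a vector database.
emails = [
    {"text": "Re: project deadline", "vec": [0.9, 0.1, 0.0]},
    {"text": "Lunch on Friday?",     "vec": [0.0, 0.8, 0.6]},
    {"text": "Deadline moved up",    "vec": [0.8, 0.2, 0.1]},
]

hits = top_k([1.0, 0.0, 0.0], emails, k=2)
# The retrieved emails are then placed into the LLM prompt so the generated
# draft can mirror the user's prior phrasing.
```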

A cattle breed classification model built on ResNet-18, achieving 95% accuracy with strong precision and recall across 8 breeds. Class imbalance in the skewed dataset was addressed with class-weighting, assigning higher loss weights to underrepresented breeds. The model was first trained for several epochs with most layers frozen, then fully fine-tuned, yielding robust performance despite the imbalance.
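The class-weighting idea is simple: scale each class's contribution to the loss by the inverse of its frequency. A short sketch with hypothetical breed names and counts (not the project's actual data):

```python
from collections import Counter

# Illustrative per-image labels for a skewed dataset.
labels = ["gir"] * 400 + ["sahiwal"] * 250 + ["tharparkar"] * 50

def inverse_frequency_weights(labels):
    """Weight each class by total / (n_classes * count), so that rare
    classes contribute proportionally more to the training loss."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {cls: n / (k * c) for cls, c in counts.items()}

weights = inverse_frequency_weights(labels)
# The rare "tharparkar" class gets the largest weight. In a PyTorch
# pipeline, these values would be passed as a tensor to e.g.
# torch.nn.CrossEntropyLoss(weight=...).
```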

A project documenting my effort to understand the intuition behind research papers and the math involved, followed by code implementations of each.