TransformerTorch
Implementing the Transformer architecture from scratch in PyTorch, and training a Transformer-based Neural Machine Translation (NMT) system.
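At the heart of any from-scratch Transformer implementation is scaled dot-product attention. As a taste of what the project covers, here is a minimal NumPy sketch of that operation; the function and variable names are illustrative, not taken from the repository:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # convex combination of values

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 4, 8))   # toy queries, keys, values: each (4, 8)
out = attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Because each row of the attention weights sums to 1, the output is a weighted average of the value vectors, which is what lets the model mix information across positions.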
VIEW ON GITHUB

A Machine Learning Blog by Hooman Amini
Neural Machine Translation System for English–Spanish Using an RNN-Based Encoder–Decoder with Attention in PyTorch.
VIEW ON GITHUB

Tackling IMDb Reviews Sentiment Classification Using Transformer Fine-Tuning (DeBERTaV3), BiGRU with GloVe Pretrained Embeddings, BiGRU with WordPiece, and Classic ML Baselines.
VIEW ON GITHUB

Character-Level Language Modeling for Text Generation in the Style of William Shakespeare Using a Multi-Layer GRU Model with PyTorch.
VIEW ON GITHUB

Bach-Style Music Generation Using an LSTM-Based Model in TensorFlow, with a Hugging Face Demo and Dockerized Deployment.
VIEW ON GITHUB

A Lightweight, Production-Ready, End-to-End Sentiment Analysis Pipeline Based on IMDb Reviews, Featuring a Flask Web App, Dockerized Deployment, and a Hugging Face Spaces Online Demo.
VIEW ON GITHUB

Gaussian Mixture Models: From Theory to Generating Human-Like Faces with the LFW Dataset.
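The generative step behind this project is simple: pick a mixture component according to its weight, then draw from that component's Gaussian. A toy 2-D sketch of that sampling step (the project applies the same idea to compressed LFW face vectors; the parameters below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-component GMM parameters, purely illustrative.
weights = np.array([0.6, 0.4])                 # mixing coefficients (sum to 1)
means = np.array([[-3.0, 0.0], [3.0, 0.0]])    # one mean per component
covs = np.array([np.eye(2), np.eye(2)])        # one covariance per component

def sample_gmm(n):
    """Draw n samples: choose a component by weight, then sample its Gaussian."""
    comps = rng.choice(len(weights), size=n, p=weights)
    return np.array([rng.multivariate_normal(means[k], covs[k]) for k in comps])

samples = sample_gmm(500)
print(samples.shape)  # (500, 2)
```

Swap the toy parameters for ones fitted on face data (typically after PCA compression) and the same two-line sampling loop produces new face-like vectors.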
VIEW ON GITHUB

Implementation of Linear Models from Scratch with NumPy: Linear Regression (SVD; Gradient Descent variants: Batch, Mini-Batch, Stochastic), Logistic Regression & Softmax Regression for Classification.
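For a flavor of the SVD route to linear regression, here is a minimal sketch of the closed-form least-squares solution via the pseudoinverse; the data and names are illustrative, not code from the repository:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression problem: y = X w + small noise.
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

# Least squares via the SVD: X = U S V^T, so w = V diag(1/s) U^T y.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
w = Vt.T @ ((U.T @ y) / s)
print(w.shape)  # (3,)
```

The SVD form is numerically more stable than inverting `X.T @ X` directly, which is why it is the standard closed-form alternative to the gradient-descent variants the project also implements.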
VIEW ON GITHUB

I'm Hooman, a Data Scientist specializing in End-to-End Machine Learning systems, with an MSc in Systems Optimization and a strong foundation in Statistical Modeling & Data Mining.
Experienced across the full ML lifecycle: from designing Data Pipelines and training & fine-tuning custom models for specific business needs, to designing & implementing Agentic AI workflows in production environments.