Independent AI researcher. No lab, no supervisor — just a machine and a research agenda.
My work sits at the intersection of neural signal processing, non-standard data modalities, and ML systems under real hardware constraints. I compete in ML challenges as a forcing function for depth, not rankings.
Most recently I built IMAGINE, a cross-domain MEG decoding pipeline for mental imagery that combines localizer-to-imagery transfer, temporal search, and complementary MEG feature branches.
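The localizer-to-imagery transfer idea can be sketched in a few lines: fit a supervised projection on the label-rich localizer task, build one prototype per class in that space, then classify imagery trials by nearest prototype. This is a minimal illustration on synthetic Gaussian features standing in for MEG trials, not the actual IMAGINE code; all array shapes and parameters here are made up for the demo.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic stand-ins for MEG features: 4 classes, 64 features per trial,
# 50 localizer trials and 10 imagery trials per class.
n_classes, n_feat = 4, 64
means = rng.normal(0, 1, (n_classes, n_feat))
X_loc = np.vstack([m + rng.normal(0, 2, (50, n_feat)) for m in means])
y_loc = np.repeat(np.arange(n_classes), 50)
X_img = np.vstack([m + rng.normal(0, 2, (10, n_feat)) for m in means])
y_img = np.repeat(np.arange(n_classes), 10)

# 1) Fit shrinkage LDA on the localizer task, where labels are plentiful.
lda = LinearDiscriminantAnalysis(solver="eigen", shrinkage="auto")
Z_loc = lda.fit_transform(X_loc, y_loc)

# 2) Build one prototype per class in the shared LDA space.
prototypes = np.stack([Z_loc[y_loc == c].mean(axis=0) for c in range(n_classes)])

# 3) Project imagery trials into the same space; match to nearest prototype.
Z_img = lda.transform(X_img)
dists = np.linalg.norm(Z_img[:, None, :] - prototypes[None, :, :], axis=-1)
pred = dists.argmin(axis=1)
print("imagery accuracy:", (pred == y_img).mean())
```

The prototype-matching step is what makes the transfer cheap: no labels from the imagery session are needed at decode time, only the projection and class means learned from the localizer.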
Signal Processing → MEG/EEG · Riemannian geometry · CSP · Xdawn · MNE
Deep Learning → PyTorch · Transformers · Diffusion · EEGNet
Classical ML → XGBoost · LightGBM · Stacking · Bayesian optimization
| | Project | What it is |
|---|---|---|
| 🧠 | IMAGINE | Cross-domain MEG decoding of mental imagery with LDA + prototype matching, built as a research-style pipeline |
| 🌌 | MALLORN | TDE detection from astronomical lightcurves — physics-informed features + semi-supervised ensembling |
| ♟ | Chess Transformer Engine | Move prediction via self-attention trained on 2M PGN games |
| 🔥 | Wildfire Monitoring | Multi-stage fire prediction pipeline integrating satellite + sensor data |
Before AI research, I spent several years in game development — building games and systems in Unity with C#. That background shaped how I think about real-time systems, optimization under constraints, and building things that actually run on real hardware. It's also where I developed an intuition for debugging complex, stateful systems — which transfers directly into ML pipelines.
Brain-computer interfaces · Neuroscience-inspired architectures · Predictive coding · Non-standard modalities · Continual learning · Small-model optimization · Astronomical ML