AI Developer & Database Engineer
From designing cybersecurity databases to building LLM-powered platforms, my focus is on creating systems that think as clearly as they scale. You could say I query the future.
Say hello to Peyton.ai, my LLM-powered resume chatbot experiment. It started as a tiny Mixture of Experts (MoE) model trained from scratch; now it uses retrieval-augmented generation (RAG) over my resume dataset to accurately answer questions about my work experience. Hosted on my local distributed GPU cluster.
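The core idea behind a RAG chatbot like this is simple: retrieve the most relevant resume snippets for a question, then hand them to the language model as context. A minimal sketch of the retrieval step, using bag-of-words cosine similarity over hypothetical snippets (the actual Peyton.ai dataset, model, and retriever are not shown here):

```python
import math
import re
from collections import Counter

# Hypothetical resume snippets standing in for the real dataset.
DOCS = [
    "Designed cybersecurity databases for structured threat data.",
    "Trained a tiny Mixture of Experts language model from scratch.",
    "Built a RAG pipeline to answer questions about work experience.",
]

def vectorize(text: str) -> Counter:
    """Turn text into a bag-of-words term-frequency vector."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k snippets most similar to the query."""
    q = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(q, vectorize(d)), reverse=True)
    return ranked[:k]

# The retrieved snippet would be prepended to the LLM prompt as context.
context = retrieve("What did you train from scratch?", DOCS)
```

In practice the bag-of-words retriever would be swapped for embedding similarity, but the pipeline shape, retrieve then generate, stays the same.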
You have questions? I've got answers.
Mixing Gated Linear Recurrences with Local Attention for Efficient Language Models.
Paper implementation
Transformer model that reports confidence in its predictions.
Fast and efficient Text-to-Speech model.
Exploring memory mechanisms for Large Language Models at scale.
A straightforward implementation of the Vision Transformer.
A straightforward implementation of CLIP.
A small Mixture of Experts language model.
Transformer model focused on conversational AI.
Transformer model for predicting sequences of actions.
Combining YOLOv5 object detection with Deep Q-Learning.
Multi-Input Multi-Output Transformer models with early exit.
Paper implementation
Speech-to-speech translation model based on T5.
I used to just build databases. Now, I train the AI that speaks from them. Code, data, and curiosity—that's my stack.
I started out as a nuclear machinist's mate in the Navy; now I'm a developer and database engineer with 5+ years of experience building intelligent systems with structured data and large language models. Whether it's designing resilient databases, fine-tuning LLMs, or developing unique AI systems, I bring structure and intelligence together to create tools that think, respond, and evolve. Let's make machines a little smarter and a lot more useful.