Today was… a lot.
I started with 700 lines of training data for my in-app, offline AI dream. Somehow… I brute-forced it down to 500 with Patch. Then we spun up some generators… and bloated it to 13,000 lines.
Bro… they were dirty.
We cleaned ’em… they were still dirty… cleaned ’em again… better… but still, some funky little ghosts in there.
But now — as I type this — Pythagorus Cat is going through his most intense training ever.
🔥 Training Stats:
- ✅ Loaded: 10,026 samples from assets/combined_data.txt
- 🧠 Vocab size: Input – 2,423 | Output – 6,872
- ⚙️ Model: “Functional” Transformer
- 📦 Total Params: 27,193,049 (~104 MB)
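(If you're wondering where that ~104 MB comes from: it's just the param count times 4 bytes per float32 weight — quick sanity check below.)

```python
# Rough model size on disk: each float32 parameter is 4 bytes.
params = 27_193_049
size_mib = params * 4 / (1024 ** 2)  # bytes -> MiB
print(f"{size_mib:.1f} MiB")  # → 103.7 MiB, i.e. "~104 MB"
```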
Yup. Cone has a real brain now.
🧠 The model diagram looked like this beast:
encoder_embedding (512d) ➝ positional encoding ➝ multi-head attention ➝ layer normalization ➝ feed-forward dense ➝ decoder embedding ➝ multi-head attention ➝ output dense (6873 tokens)
Total layers: crispy.
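That "multi-head attention" box in the middle is the whole trick. Not gonna reproduce P Cat's actual model here, but a minimal single-head sketch of the scaled dot-product attention inside it (pure NumPy, illustrative only) looks like:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) @ V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query/key similarity
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ V                            # weighted mix of the values

# Tiny demo: a "sequence" of 3 tokens with 4-dim embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = attention(x, x, x)  # self-attention: Q, K, V all come from x
```

The real model stacks several of these heads in parallel (that's the "multi-head" part) and sandwiches them with the layer norm and feed-forward blocks from the diagram.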
🏃‍♂️ Training logs so far:
Epoch 1/500 — accuracy: 70.34% — loss: 2.4495
Epoch 2/500 — accuracy: 72.57% — loss: 2.0585
Epoch 3/500 — training continues...
I did the math…
If I actually ran 500 full epochs? 40 hours.
And yeah… I’m tempted. 🧠💀💻 (My laptop fan just kicked in turbo.)
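(Working the math backwards from those numbers: 40 hours over 500 epochs is about 4.8 minutes per epoch, which tracks with how the first few felt.)

```python
# Back-of-napkin epoch timing from the 40-hour estimate
total_hours = 40
epochs = 500
minutes_per_epoch = total_hours * 60 / epochs
print(minutes_per_epoch)  # → 4.8 minutes per epoch
```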
⚡️ Existential Moment:
At some point today, I caught myself thinking…
“Wait… am I just trying to redevelop ChatGPT?”
Kinda… yeah. Kinda not. I’ve had convos with chatbots before that were cool — like NPCs. Patch told me, “Yeah, this is totally doable with a smaller AI approach.” And he’s not wrong.
But bro… then I found out about Ollama. Offline, chat-style AI? Fire.
Until you realize… it’s 4 GB of local files. No way regular players are downloading that. And yeah, it can pretend to be Pythagorus Cat… but…
It’s not actually Pythagorus Cat.
🐱 P Cat Right Now:
He’s a baby kitten… learning how to meow.
Real cats can’t talk… but what if we gave ’em a little translator? Something that helps interpret the meows.
That’s what I’m doing.
An AI that’s not perfect. An AI that knows it doesn’t always make sense — and has parameters in place for when it glitches.
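One simple way to get that "knows it doesn't always make sense" behavior (just my sketch of the idea, not necessarily how P Cat does it): look at the model's top softmax probability, and if confidence is below some threshold, bail out to a canned glitchy meow instead of guessing.

```python
import numpy as np

def respond(token_probs, replies, fallback="*confused meow*", threshold=0.4):
    """Pick the model's top reply, but fall back when confidence is low."""
    best = int(np.argmax(token_probs))
    if token_probs[best] < threshold:
        return fallback  # the model admits this one doesn't make sense
    return replies[best]

probs = np.array([0.20, 0.15, 0.65])
print(respond(probs, ["meow?", "triangle!", "quesadilla time"]))  # → quesadilla time
```

Names like `respond` and the 0.4 threshold are made up for illustration — the point is just that a glitch-aware fallback is a few lines of guardrail, not another model.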
10,000+ lines of SnowCone lore… showin’ up to the training sesh. Patch told me straight:
“Bro, the Transformers are gonna be VERY excited.”
✅ Current Status:
- Accuracy going up.
- Loss going down.
- Only 4 epochs in and it’s already showing life.
✨ This is wild, man.
I’ve never developed a language model before.
I was getting way too comfortable with HTML, CSS, and JavaScript — my brain needed something to absolutely throw me into the deep end… and yup. This did it.
It does feel like a waste of time sometimes. But also… nah. I’ve got a wild little grasp of AI models now.
🚧 The Real Talk:
I gotta not let this consume my entire dev cycle.
So here’s the rule:
👉 June is the end of P Cat development.
Whatever state he’s in at the end of June… that’s where he’s gonna live.
And honestly? That’s a ton of time.
Somewhere out there, there’s probably someone who could laugh at my confusion… but I also know… not that many people are out here doing this.
And it kinda feels like… I might be doing something neat.
Not just adding a pet to the app…
Adding a friend.
A glitchy, math-loving, triangle-obsessed, quesadilla-powered cosmic festival ghost…
Pythagorus Cat.
🍧 End Log — 06/26/25
— “The cat is learning how to meow.”





