A tiny GPT (4,608 parameters) trained on 43K Portuguese words
Based on Andrej Karpathy's MicroGPT — same architecture as ChatGPT, just smaller
This is a real neural network running in your browser — not a random letter shuffler. It's a GPT transformer (the same architecture behind ChatGPT), scaled down to just 4,608 parameters.
The model learned Portuguese phonotactics — which character sequences are natural in Portuguese (consonant clusters like nh, lh, ção; vowel patterns with accents; common endings like -mente, -eiro, -ação). It generates new words character by character, picking the next letter based on learned probabilities.
Temperature controls randomness: low = safe, common patterns; high = creative, unusual combinations.
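The sampling step above can be sketched in a few lines. This is a minimal illustration, not the demo's actual code: it assumes the model exposes one raw logit per character in the vocabulary, divides the logits by the temperature, converts them to probabilities with a softmax, and draws the next character from that distribution.

```javascript
// Convert raw logits to a probability distribution, scaled by temperature.
// Low temperature sharpens the distribution toward the top logit;
// high temperature flattens it toward uniform.
function softmaxWithTemperature(logits, temperature) {
  const scaled = logits.map(l => l / temperature);
  const max = Math.max(...scaled);              // subtract max for numerical stability
  const exps = scaled.map(s => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum);
}

// Pick an index according to the probabilities (inverse-CDF sampling).
function sampleIndex(probs, rand = Math.random()) {
  let cum = 0;
  for (let i = 0; i < probs.length; i++) {
    cum += probs[i];
    if (rand < cum) return i;
  }
  return probs.length - 1;                      // guard against float round-off
}
```

Generating a word is then a loop: feed the characters so far through the model, turn the logits for the next position into probabilities with `softmaxWithTemperature`, sample one character with `sampleIndex`, append it, and repeat until an end-of-word token appears.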
Built on Andrej Karpathy's MicroGPT (Feb 2026): the model was pre-trained in Python, and inference was ported to JavaScript.