Gerador de Palavras (Word Generator)

A tiny GPT (4,608 parameters) trained on 43K Portuguese words

Based on Andrej Karpathy's MicroGPT — same architecture as ChatGPT, just smaller


How does it work?

This is a real neural network running in your browser — not a random letter shuffler. It's a GPT transformer (the same architecture behind ChatGPT) with:

  • 4,608 learned parameters
  • 1 transformer layer, 4 attention heads
  • 16-dimensional embeddings, 16-token context
  • Trained on 43,532 Portuguese words (2,000 steps)
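The specs above can be collected into a small configuration object. This is an illustrative sketch, not the project's actual code; the property names are assumptions.

```javascript
// Sketch of the model configuration described above.
// Property names are illustrative, not taken from the real codebase.
const config = {
  nParams: 4608,   // total learned parameters
  nLayer: 1,       // transformer layers
  nHead: 4,        // attention heads
  nEmbd: 16,       // embedding dimension
  blockSize: 16,   // context length in tokens (characters)
  trainWords: 43532, // size of the Portuguese word list
  trainSteps: 2000,  // optimization steps
};
console.log(config);
```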

The model learned Portuguese phonotactics — which character sequences are natural in Portuguese (digraphs like nh and lh, endings like -ção, vowel patterns with accents, common suffixes like -mente, -eiro, -ação). It generates new words character by character, picking the next letter based on learned probabilities.

The temperature slider ("conservador" to "criativo") controls randomness by scaling the model's output probabilities: low = safe, common patterns; high = creative, unusual combinations.

Based on Andrej Karpathy's MicroGPT (Feb 2026). Pre-trained in Python, inference ported to JavaScript.