Eric Transformer

Local pre-training, fine-tuning, and inference for LLMs.

  • Format your text data in JSONL and then use a few lines of code to train models.
  • Full-parameter training of GPT-OSS-20b on a single H200.
  • Use Apple's new MLX-LM framework for fast inference. Run GPT-OSS-120b locally.
  • Enable retrieval-augmented generation (RAG) powered by Eric Search.
  • Local experiment tracking that displays charts and metrics.
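As a minimal sketch of the JSONL format mentioned above: each line of the file is one standalone JSON object. The field names (`prompt`, `completion`) are an assumption for illustration, not the schema erictransformer requires — check the project docs for the expected keys.

```python
import json

# Hypothetical training records; the field names are an assumption,
# not erictransformer's documented schema.
records = [
    {"prompt": "What is JSONL?", "completion": "One JSON object per line."},
    {"prompt": "Why JSONL?", "completion": "It streams and appends easily."},
]

# Write JSONL: serialize each record to a single line.
with open("train.jsonl", "w", encoding="utf-8") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")

# Read JSONL back: parse each non-empty line independently.
with open("train.jsonl", encoding="utf-8") as f:
    loaded = [json.loads(line) for line in f if line.strip()]
```

Because every line is self-contained, JSONL files can be appended to, streamed, and split without re-parsing the whole file.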

Install

pip install erictransformer

Code: GitHub

Maintainers