
# Inference

This folder contains scripts to get you started with inference on Meta Llama models.

- **Local Inference** contains scripts for memory-efficient inference on servers and local machines.