
# Inference

This folder contains scripts to get you started with inference on Meta Llama models.

* Local Inference contains scripts for memory-efficient inference on servers and local machines (see the sketch after this list for the general idea)
* Mobile Inference has scripts that use MLC to serve Llama on Android (h/t to OctoAI for the contribution!)
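
The snippet below is a minimal sketch of what memory-efficient local inference typically looks like, shown here only for orientation; it is not one of the scripts in this folder. It assumes `transformers`, `accelerate`, and `bitsandbytes` are installed, that you have access to the gated Meta Llama weights on Hugging Face, and the model id and generation settings are illustrative placeholders.

```python
# Illustrative sketch: 4-bit quantized inference to reduce GPU memory use.
# Assumptions: transformers + bitsandbytes installed, access to the gated
# meta-llama checkpoint named below (swap in any Llama checkpoint you have).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-3.1-8B-Instruct"  # placeholder model id

# Load weights in 4-bit so the model fits on a single consumer GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # spread layers across available GPU(s)/CPU automatically
)

# Build a chat-formatted prompt and generate a short completion.
messages = [{"role": "user", "content": "Summarize the Llama models in one line."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The actual scripts in Local Inference may use different loading and quantization options; see that folder's README for the supported configurations.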