
Quickstart > Inference

This folder contains scripts to get you started with inference on Meta Llama models.

  • Code Llama contains scripts for tasks related to code generation using CodeLlama.
  • Local Inference contains scripts for memory-efficient inference on servers and local machines (a minimal sketch follows this list).
  • Mobile Inference has scripts that use MLC to serve Llama on Android (h/t to OctoAI for the contribution!).
  • Model Update Example shows how to replace a Llama 3 model with a Llama 3.1 model.
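
For orientation, here is a minimal sketch of memory-efficient local inference, assuming the Hugging Face transformers and bitsandbytes packages and access to the meta-llama/Meta-Llama-3.1-8B-Instruct checkpoint; the actual, more complete scripts live in the Local Inference folder.

```python
# Minimal sketch: 4-bit quantized inference with a Llama model (assumed checkpoint,
# not the folder's own script). Requires: transformers, bitsandbytes, accelerate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # assumed model ID

# Load weights in 4-bit to keep GPU memory usage low on a single device.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)

prompt = "Explain what quantized inference is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```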