
# Quickstart > Inference

This folder contains scripts to get you started with inference on Meta Llama models.

- [code_llama](./code_llama/) contains scripts for tasks relating to code generation using CodeLlama
- [local_inference](./local_inference/) contains scripts for memory-efficient inference on servers and local machines
- [mobile_inference](./mobile_inference/) has scripts that use MLC to serve Llama on Android (h/t to OctoAI for the contribution!)