## Quickstart > Inference

This folder contains scripts to get you started with inference on Meta Llama models.

* [Local Inference](./local_inference/) contains scripts for memory-efficient inference on servers and local machines (see the sketch below)
* [Mobile Inference](./mobile_inference/) has scripts using MLC to serve Llama on Android (h/t to OctoAI for the contribution!)
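
To illustrate what "memory-efficient inference" can look like, here is a minimal sketch using Hugging Face `transformers` with 4-bit quantization via `bitsandbytes`. The model ID, prompt, and quantization settings are illustrative assumptions, not what the scripts in [Local Inference](./local_inference/) necessarily use; refer to that folder for the supported workflows.

```python
# A minimal sketch of memory-efficient local inference with 4-bit NF4
# quantization. Assumes transformers, accelerate, and bitsandbytes are
# installed; model ID and prompt are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-3.1-8B-Instruct"  # assumption: any Llama causal LM works here

# Quantize weights to 4-bit NF4 so a large model fits in far less GPU memory.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers across available GPUs/CPU automatically
)

inputs = tokenizer("Explain KV caching in one sentence.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```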