## Quickstart > Inference

This folder contains scripts to get you started with inference on Meta Llama models.

* [Code Llama](./code_llama/) contains scripts for tasks relating to code generation using CodeLlama
* [Local Inference](./local_inference/) contains scripts for memory-efficient inference on servers and local machines
* [Mobile Inference](./mobile_inference/) has scripts that use MLC to serve Llama on Android (h/t to OctoAI for the contribution!)
* [Model Update Example](./modelUpgradeExample.py) shows an example of replacing a Llama 3 model with a Llama 3.1 model; a minimal sketch of that kind of swap follows this list.
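
As a quick orientation (not a substitute for the scripts above), the snippet below is a minimal sketch of the kind of swap the model update example performs: loading a Llama 3.1 checkpoint with Hugging Face `transformers` where a Llama 3 checkpoint was used before. The model IDs shown are the standard gated Hugging Face Hub names; adjust them to whichever checkpoints you actually use.

```python
# Minimal sketch (not the repo's script): upgrading from a Llama 3 to a
# Llama 3.1 checkpoint with Hugging Face transformers.
# Assumes `pip install transformers torch accelerate` and access to the
# gated meta-llama checkpoints on the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# OLD_MODEL_ID = "meta-llama/Meta-Llama-3-8B-Instruct"  # previous checkpoint
MODEL_ID = "meta-llama/Meta-Llama-3.1-8B-Instruct"      # upgraded checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # half precision keeps memory use manageable
    device_map="auto",           # place layers on available GPU(s)/CPU
)

prompt = "Explain what changed between Llama 3 and Llama 3.1 in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because Llama 3 and Llama 3.1 share the same tokenizer and chat format at this scale, the upgrade in a script like this usually amounts to changing the model ID string.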