## Quickstart > Inference

This folder contains scripts to get you started with inference on Meta Llama models.

* [Local Inference](./local_inference/) contains scripts for memory-efficient inference on servers and local machines (a minimal sketch of the idea follows below)
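
To illustrate what "memory-efficient inference" means here, below is a minimal sketch using the Hugging Face `transformers`, `accelerate`, and `bitsandbytes` packages to load a Llama model with 8-bit quantized weights. The model id is an example (gated models require access approval on Hugging Face), and the actual scripts in [Local Inference](./local_inference/) may use different models, loading options, or entry points.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Example model id; swap in whichever Meta Llama checkpoint you have access to.
model_id = "meta-llama/Llama-3.1-8B-Instruct"

# 8-bit weight quantization roughly halves memory use versus fp16/bf16 weights.
bnb_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers across available GPUs/CPU automatically
)

prompt = "Explain what quantization does to a language model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.inference_mode():  # disable autograd bookkeeping during generation
    output = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The same loading pattern works on a single consumer GPU or a multi-GPU server; `device_map="auto"` and the quantization config are what keep the memory footprint down.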