
Quickstart > Inference

This folder contains scripts to get you started with inference on Meta Llama models.

  • Code Llama contains scripts for code-generation tasks using Code Llama.
  • Local Inference contains scripts for memory-efficient inference on servers and local machines.
  • Mobile Inference has scripts using MLC to serve Llama on Android (h/t to OctoAI for the contribution!).
  • Model Update Example shows how to replace a Llama 3 model with a Llama 3.1 model.