
Quickstart > Inference

This folder contains scripts to get you started with inference on Meta Llama models.

  • Code Llama contains scripts for code-generation tasks using Code Llama
  • Local Inference contains scripts for memory-efficient inference on servers and local machines
  • Mobile Inference has scripts using MLC to serve Llama on Android (h/t to OctoAI for the contribution!)
  • Model Update Example shows how to replace a Llama 3 model with a Llama 3.1 model; a minimal sketch of that swap follows this list.
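
The snippet below is a minimal sketch of such an upgrade using Hugging Face transformers; it is not the repo's modelUpgradeExample.py. Because Llama 3.1 keeps a compatible tokenizer and chat template, upgrading typically amounts to changing the model ID. The model IDs, dtype, and generation settings shown are assumptions; adjust them to the variant and hardware you use.

```python
# Minimal sketch (assumed model IDs): swap a Llama 3 checkpoint for its
# Llama 3.1 counterpart by changing the model identifier.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

OLD_MODEL_ID = "meta-llama/Meta-Llama-3-8B-Instruct"
NEW_MODEL_ID = "meta-llama/Llama-3.1-8B-Instruct"  # drop-in replacement

def load(model_id: str):
    # Load tokenizer and model; bf16 + device_map="auto" keeps memory use modest.
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )
    return tokenizer, model

tokenizer, model = load(NEW_MODEL_ID)  # previously: load(OLD_MODEL_ID)

# The rest of the pipeline is unchanged after the swap.
messages = [{"role": "user", "content": "Write a haiku about llamas."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```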