# Quickstart > Inference

This folder contains scripts to get you started with inference on Meta Llama models.

- [Local Inference](./local_inference/) contains scripts for memory-efficient inference on servers and local machines (a minimal usage sketch follows this list).
- [Mobile Inference](./mobile_inference/) has scripts that use MLC to serve Llama on Android (h/t to OctoAI for the contribution!).
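
As a quick illustration of what "memory-efficient inference" looks like in practice, the sketch below loads a Llama checkpoint with 4-bit quantization using Hugging Face `transformers` and `bitsandbytes`. This is not the repo's actual script, and the model id and prompt are placeholders you would replace with your own:

```python
# Minimal sketch: memory-efficient local inference with 4-bit quantization.
# Assumes `transformers`, `accelerate`, and `bitsandbytes` are installed and
# that you have access to the (placeholder) gated model id below.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-3.1-8B-Instruct"  # placeholder; swap in your Llama checkpoint

# 4-bit NF4 quantization keeps the memory footprint small enough for a single consumer GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers on available devices automatically
)

prompt = "Explain the difference between fine-tuning and prompting in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The scripts in [Local Inference](./local_inference/) cover this and related setups in more detail; see that folder's README for the supported options.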