# Inference

This folder contains inference examples for Llama 2. So far, we provide support for three methods of inference:

1. The inference.py script provides support for Hugging Face accelerate, PEFT, and FSDP fine-tuned models (a minimal loading sketch follows this list).

2. The vLLM_inference.py script takes advantage of vLLM's paged attention for low-latency generation (see the vLLM sketch further below).

  3. The hf-text-generation-inference folder contains information on Hugging Face Text Generation Inference (TGI).
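
As a rough illustration of what the first option involves, the sketch below loads a base Llama 2 checkpoint with Hugging Face transformers and attaches a PEFT adapter produced by fine-tuning. It is not the script itself; the model name and adapter path are placeholders, and inference.py exposes the equivalent settings through its command line arguments.

```python
# Minimal sketch (not inference.py itself): load a base Llama 2 checkpoint and
# attach a PEFT adapter. Model name and adapter path are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model = "meta-llama/Llama-2-7b-hf"    # placeholder base checkpoint
adapter_path = "path/to/peft/checkpoint"   # placeholder PEFT output directory

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    torch_dtype=torch.float16,
    device_map="auto",  # let accelerate place the weights across devices
)
model = PeftModel.from_pretrained(model, adapter_path)
model.eval()

prompt = "Summarize the following conversation:\n..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```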

For more in-depth information on inference, including safety checks and examples, see the inference documentation here.
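
For comparison, a minimal offline-generation sketch with the vLLM library might look like the following; the model name is a placeholder, and vLLM_inference.py may configure sampling differently.

```python
# Minimal offline-generation sketch with vLLM, which uses paged attention to
# serve requests efficiently. The model name/path is a placeholder.
from vllm import LLM, SamplingParams

llm = LLM(model="meta-llama/Llama-2-7b-hf")  # placeholder model name/path
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=128)

prompts = ["What is the capital of France?"]
for request_output in llm.generate(prompts, sampling_params):
    print(request_output.outputs[0].text)
```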

## System Prompt Update

### Observed Issue

We received feedback from the community on our prompt template, and we are providing an update to reduce the false refusal rates observed. False refusals occur when the model incorrectly refuses to answer a question that it should, for example, due to overly broad instructions to be cautious in how it provides responses.

### Updated approach

Based on evaluation and analysis, we recommend removing the system prompt as the default setting. Pull request #104 removes the system prompt as the default option but still provides an example to enable experimentation for those who want to use it.
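
For context, the sketch below assembles a single-turn chat prompt with the system prompt treated as optional, using the special tags from the published Llama 2 chat format. The format_prompt helper is illustrative only; the actual prompt handling lives in chat_utils.py.

```python
# Illustrative sketch of the Llama 2 chat prompt format with an optional
# system prompt; format_prompt is a hypothetical helper, not code from
# chat_utils.py.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def format_prompt(user_message, system_prompt=None):
    # New default: no system prompt, so the user turn is wrapped directly in
    # [INST] ... [/INST]. If a system prompt is supplied, it is placed inside
    # <<SYS>> ... <</SYS>> at the start of the first turn.
    content = user_message
    if system_prompt:
        content = B_SYS + system_prompt + E_SYS + user_message
    return f"{B_INST} {content.strip()} {E_INST}"

print(format_prompt("Write a haiku about the ocean."))
```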