
updated readme

Justin Lee 3 months ago
commit 0bec41f86a
1 changed file with 55 additions and 19 deletions

+ 55 - 19
end-to-end-use-cases/prompt-migration/README.md

@@ -1,8 +1,9 @@
+
 # Prompt Migration
 
 ## Overview
 
-The prompt migration toolkit helps you assess and adapt prompts across different language models, ensuring consistent performance and reliability. It includes benchmarking capabilities and evaluation tools to measure the effectiveness of prompt migrations.
+The **Prompt Migration** toolkit helps you assess and adapt prompts across different language models, ensuring consistent performance and reliability. It includes benchmarking capabilities and evaluation tools to measure the effectiveness of prompt migrations.
 
 ## Project Structure
 
@@ -11,31 +12,66 @@ The prompt migration toolkit helps you assess and adapt prompts across different
 - `benchmarks/`: Tools and scripts for performance evaluation
 - `environment.yml`: Conda environment specification with all required dependencies
 
-## Setup Instructions
+## Prerequisites
+
+1. **Conda Environment**
+   - [Miniconda](https://docs.conda.io/en/latest/miniconda.html) or [Anaconda](https://www.anaconda.com/) installed
+   - Python 3.10
+   - Create and activate the environment:
+     ```bash
+     conda env create -f environment.yml
+     conda activate prompt-migration
+     ```
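+   - To double-check that the environment resolved correctly, you can query the interpreter version; it should report Python 3.10:
+     ```bash
+     python --version
+     ```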
+
+2. **Setting Up vLLM for Inference**
+   If you plan to use [vLLM](https://github.com/vllm-project/vllm) for model inference:
+   ```bash
+   pip install vllm
+   ```
+   To serve a large model (example: Meta’s Llama 3.3 70B Instruct), you might run:
+   ```bash
+   vllm serve meta-llama/Llama-3.3-70B-Instruct --tensor-parallel-size=2
+   ```
+   Adjust the model name and `--tensor-parallel-size` according to your hardware and parallelization needs.
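+   Once the server is up, you can sanity-check it with a request to its OpenAI-compatible endpoint (vLLM listens on port 8000 by default; the `model` field should match the name you served):
+   ```bash
+   curl http://localhost:8000/v1/chat/completions \
+     -H "Content-Type: application/json" \
+     -d '{
+           "model": "meta-llama/Llama-3.3-70B-Instruct",
+           "messages": [{"role": "user", "content": "Say hello."}]
+         }'
+   ```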
 
-1. Install dependencies using Conda:
-```bash
-conda env create -f environment.yml
-conda activate prompt-migration
-```
+3. **Accessing Hugging Face Datasets**
+   If you need to work with private or gated Hugging Face datasets, follow these steps:
+   1. **Create a Hugging Face account (if you don’t have one):**
+      Visit [Hugging Face](https://huggingface.co/) and create an account.
+   2. **Authenticate via the Hugging Face CLI:**
+      - Install the Hugging Face Hub CLI:
+        ```bash
+        pip install huggingface_hub
+        ```
+      - Log in to Hugging Face:
+        ```bash
+        huggingface-cli login
+        ```
+      - When prompted, paste a Hugging Face access token. You can generate or retrieve your token in your [Hugging Face settings](https://huggingface.co/settings/tokens).
+   3. **Check Dataset Permissions:**
+      Some datasets may require explicit permission from the dataset owner. If you continue to have access issues, visit the dataset page on Hugging Face to request or confirm your access rights.
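+      Once access has been granted, a quick way to confirm that your token works is to pull the dataset from the command line (the repository id below is a placeholder; substitute the dataset you actually need):
+      ```bash
+      huggingface-cli download <dataset-owner>/<dataset-name> --repo-type dataset
+      ```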
 
 ## Key Dependencies
 
-- Python 3.10
-- DSPy: For prompt engineering and evaluation
-- LM-eval: Evaluation framework for language models
-- PyTorch and Transformers: For model inference
+- **DSPy**: For prompt engineering and evaluation
+- **LM-eval**: Evaluation framework for language models
+- **PyTorch** and **Transformers**: For model inference
 
 ## Getting Started
 
-1. Activate your environment using Conda as described above
-2. Start Jupyter notebook server:
-```bash
-jupyter notebook
-```
-3. Navigate to the `notebooks/harness.ipynb` notebook in your browser
-4. Use the benchmarking tools in the `benchmarks/` directory to evaluate your migrations
+1. **Activate your environment:**
+   ```bash
+   conda activate prompt-migration
+   ```
+2. **Start Jupyter notebook server:**
+   ```bash
+   jupyter notebook
+   ```
+3. **Open the main notebook:**
+   Navigate to `notebooks/harness.ipynb` in your browser to get started.
+4. **Explore Benchmarks:**
+   Use the scripts in the `benchmarks/` directory to evaluate your prompt migrations.
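+   The exact entry points under `benchmarks/` may differ, but as a rough sketch, LM-eval (one of the key dependencies) can be invoked directly against a Hugging Face model; the model name and task below are placeholders:
+   ```bash
+   lm_eval --model hf \
+     --model_args pretrained=meta-llama/Llama-3.1-8B-Instruct \
+     --tasks mmlu \
+     --batch_size 8
+   ```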
 
 ## License
 
-This project is part of the Llama Recipes collection. Please refer to the main repository's license for usage terms.
+This project is part of the **Llama Recipes** collection. Please refer to the main repository’s license for usage terms.