
Add note on CUDA version + remove 'test' from pytorch whl url (#419)

Suraj Subramanian, 11 months ago
commit 4f57adae11
1 file changed, 4 insertions(+), 0 deletions(-)

README.md (+4, -0)

@@ -64,6 +64,10 @@ If you want to use PyTorch nightlies instead of the stable release, go to [this
 ### Installing
 Llama-recipes provides a pip distribution for easy install and usage in other projects. Alternatively, it can be installed from source.
 
+> [!NOTE]
+> Ensure you use the correct CUDA version (check with `nvidia-smi`) when installing the PyTorch wheels. Here we use CUDA 11.8, i.e. the `cu118` wheel suffix.
+> H100 GPUs work better with CUDA >12.0.
+
 #### Install with pip
 ```
 pip install llama-recipes
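# A minimal sketch of matching the PyTorch wheel index to your CUDA version:
# pick the suffix that corresponds to the CUDA version reported by nvidia-smi
# (assuming CUDA 11.8 here, hence cu118).
pip install torch --index-url https://download.pytorch.org/whl/cu118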