Update documentation

Signed-off-by: kingbri <bdashore3@proton.me>
kingbri 2023-12-05 00:33:43 -05:00
parent 8ba3bfa6b3
commit 621e11b940
2 changed files with 2 additions and 4 deletions


@@ -18,9 +18,9 @@ To get started, make sure you have the following installed on your system:
 - Python 3.x (preferably 3.11) with pip
-- CUDA 12.1 or 11.8 (or ROCm 5.6)
+- CUDA 12.x (you can also use CUDA 11.8 or ROCm 5.6, but there will be more work required to install dependencies such as Flash Attention 2)
-NOTE: For Flash Attention 2 to work on Windows, CUDA 12.1 **must** be installed!
+NOTE: For Flash Attention 2 to work on Windows, CUDA 12.x **must** be installed!
 ## Installing
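The updated note ties Flash Attention 2 on Windows to a CUDA 12.x build. A minimal sketch for verifying that the installed PyTorch build matches that requirement, assuming PyTorch is already installed (the expected version strings below are illustrative):

```python
# Sanity check for the CUDA 12.x requirement in the docs.
# Assumes PyTorch is already installed; version strings are examples only.
import torch

print(torch.__version__)          # e.g. a 2.1.x build to match the cu121/torch2.1 wheels
print(torch.version.cuda)         # expected to report "12.1" for a cu121 build
print(torch.cuda.is_available())  # True when the CUDA driver and GPU are usable
```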


@@ -6,8 +6,6 @@ uvicorn
 # Wheels
 # Flash Attention 2. If the wheels don't work, comment these out, uncomment the old wheels, and run `pip install -r requirements.txt`
 # Windows FA2 from https://github.com/jllllll/flash-attention/releases
-https://github.com/jllllll/flash-attention/releases/download/v2.3.6/flash_attn-2.3.6+cu121torch2.1cxx11abiFALSE-cp311-cp311-win_amd64.whl; platform_system == "Windows" and python_version == "3.11"
-https://github.com/jllllll/flash-attention/releases/download/v2.3.6/flash_attn-2.3.6+cu121torch2.1cxx11abiFALSE-cp310-cp310-win_amd64.whl; platform_system == "Windows" and python_version == "3.10"
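The removed wheel entries rely on PEP 508 environment markers (the `; platform_system == "Windows" and python_version == "3.11"` suffix) so that pip only installs the wheel matching the current OS and Python version. A small sketch of how such a marker evaluates, using the `packaging` library that pip itself builds on:

```python
# Evaluate a PEP 508 environment marker like the ones on the removed wheel lines.
# Uses the `packaging` library; shown for illustration of the marker syntax only.
from packaging.markers import Marker

marker = Marker('platform_system == "Windows" and python_version == "3.11"')
# True only on Windows running Python 3.11, i.e. when pip would pick that wheel.
print(marker.evaluate())
```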