# TabbyAPI

> [!NOTE]
> Need help? Join the Discord Server and get the Tabby role. Please be nice when asking questions.

A FastAPI-based application for generating text with large language models (LLMs) using the Exllamav2 backend.
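TabbyAPI serves an OpenAI-style HTTP API. A minimal sketch of building a completion request body in Python — the endpoint path, port, and exact parameter set here are assumptions, so consult the Wiki and `config_sample.yml` for the options your version actually supports:

```python
import json

def build_completion_request(prompt: str, max_tokens: int = 200, stream: bool = False) -> dict:
    """Hypothetical helper: builds a request body in the OpenAI completions
    style that an OpenAI-compatible endpoint such as /v1/completions accepts."""
    return {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "stream": stream,
    }

payload = build_completion_request("Once upon a time,")
# This body would be POSTed to something like http://localhost:5000/v1/completions,
# with an API key header as required by your auth configuration.
print(json.dumps(payload))
```

The helper only assembles the JSON body; wiring up the actual HTTP call and authentication is left to your client of choice.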

## Disclaimer

This API is still in its alpha phase, so expect bugs and breaking changes down the line. You may also need to reinstall dependencies after updating.

## Help Wanted

Please check the issues page for issues that contributors can help with. All contributions are appreciated. Please read the contributions section for more details about issues and pull requests.

If you want to add samplers, add them in the exllamav2 library and then link them to tabbyAPI.

## Getting Started

Read the Wiki for more information. It contains user-facing documentation for installation, configuration, sampling, API usage, and so much more.

## Supported Model Types

TabbyAPI uses Exllamav2 as a powerful and fast backend for model inference, loading, etc. Therefore, the following types of models are supported:

- Exl2 (highly recommended)
- GPTQ
- FP16 (using Exllamav2's loader)
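Which model gets loaded is controlled by your `config.yml` (copy `config_sample.yml` as the authoritative starting point). A minimal sketch of the relevant section, where the directory and model name are placeholders:

```yaml
# Hypothetical minimal model section; see config_sample.yml for the full,
# authoritative set of options and their defaults.
model:
  model_dir: models           # folder that holds your model directories
  model_name: my-exl2-model   # placeholder: any Exl2, GPTQ, or FP16 model folder
```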

## Alternative Loaders/Backends

If you want to use a different model type than the ones listed above, here are some alternative backends with their own APIs:

## Contributing

If you have issues with the project:

- Describe the issue in detail
- If you have a feature request, please indicate it as such

If you have a pull request:

- Describe the pull request in detail: what you are changing and why

## Developers and Permissions

Creators/Developers: