diff --git a/README.md b/README.md
index 41dea4a..ca2471f 100644
--- a/README.md
+++ b/README.md
@@ -20,6 +20,26 @@ If you want to add samplers, add them in the [exllamav2 library](https://github.
 
 Read the [Wiki](https://github.com/theroyallab/tabbyAPI/wiki) for more information. It contains user-facing documentation for installation, configuration, sampling, API usage, and so much more.
 
+## Supported Model Types
+
+TabbyAPI uses Exllamav2 as a powerful and fast backend for model loading and inference. Therefore, the following model types are supported:
+
+- Exl2 (highly recommended)
+
+- GPTQ
+
+- FP16 (using Exllamav2's loader)
+
+### Alternative Loaders/Backends
+
+If you want to use a model type other than those listed above, here are some alternative backends with their own APIs:
+
+- GGUF + GGML - [KoboldCPP](https://github.com/lostruins/KoboldCPP)
+
+- AWQ - [Aphrodite Engine](https://github.com/PygmalionAI/Aphrodite-engine)
+
+- [Text Generation WebUI](https://github.com/oobabooga/text-generation-webui)
+
 ## Contributing
 
 If you have issues with the project: