Update README

Signed-off-by: kingbri <bdashore3@proton.me>
kingbri 2023-12-31 11:25:18 -05:00
parent 979a9d28a3
commit 7176fa66f0

@@ -20,6 +20,26 @@ If you want to add samplers, add them in the [exllamav2 library](https://github.
Read the [Wiki](https://github.com/theroyallab/tabbyAPI/wiki) for more information. It contains user-facing documentation for installation, configuration, sampling, API usage, and so much more.
## Supported Model Types
TabbyAPI uses Exllamav2 as a powerful and fast backend for model inference and loading, so the following model types are supported:
- Exl2 (Highly recommended)
- GPTQ
- FP16 (using Exllamav2's loader)
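Once a supported model is loaded, clients talk to TabbyAPI over its HTTP API. As a minimal sketch, here is how a completion request body might be assembled in Python; the exact parameter names (`prompt`, `max_tokens`, `temperature`) follow common OpenAI-style conventions and are an assumption here, not something this README specifies — check the Wiki's API documentation for the authoritative schema.

```python
import json

# Hypothetical sketch: build a JSON body for an OpenAI-style completion
# request. Field names are assumptions based on common API conventions,
# not confirmed by this README.
def build_completion_request(prompt: str, max_tokens: int = 128,
                             temperature: float = 0.7) -> str:
    payload = {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }
    return json.dumps(payload)

# Example: serialize a request body ready to POST to the server.
body = build_completion_request("Hello, Tabby!")
print(body)
```

A real client would POST this body to the server's completions endpoint with an HTTP library such as `requests` or `httpx`.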
#### Alternative Loaders/Backends
If you want to use a model type other than those listed above, here are some alternative backends with their own APIs:
- GGUF + GGML - [KoboldCPP](https://github.com/lostruins/KoboldCPP)
- AWQ - [Aphrodite Engine](https://github.com/PygmalionAI/Aphrodite-engine)
- [Text Generation WebUI](https://github.com/oobabooga/text-generation-webui)
## Contributing
If you have issues with the project: