tabbyAPI-ollama/endpoints/OAI
kingbri 2d89c96879 API: Re-add BOS token stripping in template render
Matching YALS: if the model has add_bos_token enabled, strip the
extra BOS token from the start of the rendered prompt. The duplicate
usually comes from misconfigured chat templates such as Llama 3's.

Signed-off-by: kingbri <8082010+kingbri1@users.noreply.github.com>
2025-05-24 21:11:53 -04:00
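The stripping described in the commit message can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: the function name `strip_extra_bos` and its parameters are hypothetical, and the BOS token string used in the usage comment assumes a Llama 3-style tokenizer.

```python
def strip_extra_bos(prompt: str, bos_token: str, add_bos_token: bool) -> str:
    """Drop a leading BOS token from a rendered chat prompt.

    If the backend itself prepends a BOS token (add_bos_token=True) and the
    chat template has already baked one into the prompt, the model would see
    it twice, so the template's copy is removed.
    """
    if add_bos_token and bos_token and prompt.startswith(bos_token):
        prompt = prompt[len(bos_token):]
    return prompt


# Usage: a template that already emitted BOS loses the duplicate;
# prompts without a leading BOS pass through untouched.
deduped = strip_extra_bos("<|begin_of_text|>Hi", "<|begin_of_text|>", True)
untouched = strip_extra_bos("Hi", "<|begin_of_text|>", True)
```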
types API: Fix typing for chat templates in CC requests 2025-05-24 21:06:05 -04:00
utils API: Re-add BOS token stripping in template render 2025-05-24 21:11:53 -04:00
router.py API: Fix chat completion formatting flow 2024-11-21 17:51:14 -05:00