tabbyAPI-ollama/endpoints/OAI/types
kingbri aa657fa6e9 API: Ignore add_bos_token in chat completions
When fetching special tokens from the model, don't treat the
add_bos_token and ban_eos_token parameters as switches.

In addition, change the internal handling of add_bos_token to an optional
boolean. This lets us fall back to the model's setting when deciding whether
or not to add the BOS token, especially for chat completions.

Signed-off-by: kingbri <8082010+kingbri1@users.noreply.github.com>
2025-05-01 22:51:15 -04:00
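
The commit above describes making add_bos_token an optional boolean so that, when the request leaves it unset, chat completions defer to the model's own setting. A minimal sketch of that fallback pattern, using hypothetical names (ChatCompletionRequest, resolve_add_bos_token, model_default) rather than the actual TabbyAPI definitions:

```python
from typing import Optional

from pydantic import BaseModel


# Hypothetical request model: add_bos_token defaults to None rather than True,
# so an unset value can be told apart from an explicit False.
class ChatCompletionRequest(BaseModel):
    add_bos_token: Optional[bool] = None


def resolve_add_bos_token(request: ChatCompletionRequest, model_default: bool) -> bool:
    """Use the request value when given, otherwise fall back to the model's setting."""
    if request.add_bos_token is not None:
        return request.add_bos_token
    return model_default


# A request that leaves add_bos_token unset inherits the model default,
# while an explicit False still disables the BOS token.
print(resolve_add_bos_token(ChatCompletionRequest(), model_default=True))                     # True
print(resolve_add_bos_token(ChatCompletionRequest(add_bos_token=False), model_default=True))  # False
```
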
chat_completion.py   API: Ignore add_bos_token in chat completions        2025-05-01 22:51:15 -04:00
common.py            Model + API: Migrate to use BaseSamplerParams        2025-04-16 00:50:05 -04:00
completion.py        API: Fix finish_reason returns                       2024-03-18 15:59:28 -04:00
embedding.py         Embeddings: Fix base64 return                        2025-01-01 16:15:12 -05:00
tools.py             [WIP] OpenAI Tools Support/Function calling (#154)   2024-08-17 00:16:25 -04:00