tabbyAPI-ollama/endpoints
kingbri 79f9c6e854 Model: Remove num_experts_per_token
This shouldn't even be an exposed option since changing it always
breaks inference with the model. Let the model's config.json handle
it.

Signed-off-by: kingbri <8082010+kingbri1@users.noreply.github.com>
2025-03-19 11:52:10 -04:00
core        Model: Remove num_experts_per_token                                                                          2025-03-19 11:52:10 -04:00
Kobold      Dependencies: Update sse-starlette and formatron                                                             2024-12-21 23:14:55 -05:00
OAI         Bugfix: Chat completion requests fail with UnboundLocalError: finish_reason variable not initialized (#307)  2025-03-15 20:31:21 -04:00
server.py   Args: Expose api-servers to subcommands                                                                      2025-02-10 23:39:46 -05:00