tabbyAPI-ollama/endpoints/core
kingbri 79f9c6e854 Model: Remove num_experts_per_token
This shouldn't even be an exposed option since changing it always
breaks inference with the model. Let the model's config.json handle
it.

Signed-off-by: kingbri <8082010+kingbri1@users.noreply.github.com>
2025-03-19 11:52:10 -04:00
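
As a rough sketch of the commit's rationale (not tabbyAPI's actual code; the function name, directory layout, and key lookup below are assumptions), a mixture-of-experts setting such as num_experts_per_token can be read from the model's own config.json at load time instead of being accepted as a user-tunable parameter:

# Minimal sketch, assuming a HuggingFace-style model directory.
# Function name and return convention are illustrative, not tabbyAPI's API.
import json
from pathlib import Path
from typing import Optional


def experts_per_token(model_dir: str) -> Optional[int]:
    """Read num_experts_per_token from the model's config.json.

    Returns None for dense (non-MoE) models, which simply omit the key.
    """
    config = json.loads((Path(model_dir) / "config.json").read_text())
    return config.get("num_experts_per_token")

Keeping the value in config.json means the load API cannot override an architecture-level constant, which matches the commit's reasoning that changing it always breaks inference with the model.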
types      Model: Remove num_experts_per_token                                  2025-03-19 11:52:10 -04:00
utils      API: Add inline exception for dummy models                           2024-11-17 21:15:45 -05:00
router.py  Endpoints: Add props endpoint and add more values to model params    2024-12-26 17:32:19 -05:00