Support more common tool variables in templates (tools, message.tool_calls) (#308)
* Add non-JSON versions of `tools` and `functions` to `template_vars`. This increases compatibility with vLLM templates, which use a non-JSON tools object.
* Add the list of tool template variables to the documentation.
* Use Jinja templates to provide `tools_json` and `functions_json`. This should be functionally equivalent, but the JSON won't be produced unless it's needed.
* Make `message.tool_calls` match the JSON from `ToolCallProcessor`.
* Log something when generating tool calls.
* Add template for Qwen QwQ 32b.
* Only log if tool calls have been detected.
* API: Fix tool call variable assignments. Jinja functions do not run when variables are called, so use `json.dumps` instead. In addition, log the request ID when stating that a tool call was fired.
* Add `ToolCallProcessor.dump()` to get the list of processed dicts.
* Remove qwen_qwq_32b.jinja. This will be added to the following repository at a later date: https://github.com/theroyallab/llm-prompt-templates

Signed-off-by: kingbri <8082010+kingbri1@users.noreply.github.com>
Co-authored-by: kingbri <8082010+kingbri1@users.noreply.github.com>
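The variable-assignment fix can be sketched as follows. The use of `json.dumps` and the `tools_json` variable name come from the commit message; the tool structure and surrounding names are illustrative, not TabbyAPI's actual internals:

```python
import json
from jinja2 import Template

# Illustrative tool definition; the exact schema is an assumption.
tools = [{"type": "function", "function": {"name": "get_weather"}}]

# Broken approach: Jinja renders variables as-is and never calls them,
# so a callable is stringified to its repr rather than producing JSON.
broken = Template("{{ tools_json }}").render(tools_json=lambda: json.dumps(tools))

# Fixed approach per the commit: serialize eagerly with json.dumps so the
# template receives a plain JSON string.
fixed = Template("{{ tools_json }}").render(tools_json=json.dumps(tools))

print(fixed)
```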
This commit is contained in:
parent: ccf23243c1
commit: 436ce752da
3 changed files with 44 additions and 10 deletions
@@ -37,6 +37,11 @@ For example, if you are using a Llama 3.1 Family model you can simply modify you
If loading via `/v1/model/load`, you would also need to specify a tool-supporting `prompt_template`.
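For instance, a load request body might look like the sketch below. Only the `/v1/model/load` endpoint and the `prompt_template` key are taken from the text above; the model name, template name, and any other fields are hypothetical:

```python
import json

# Hypothetical /v1/model/load request body. Only the `prompt_template` key
# comes from the documentation; the other values are illustrative.
payload = {
    "model_name": "Llama-3.1-8B-Instruct",
    "prompt_template": "llama3_tool_calling",
}

body = json.dumps(payload)
print(body)
```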
## Tool Template Variables
- `tools`: Tools object.
- `tools_json`: Tools object as a JSON string.
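A minimal sketch of how these two variables can be consumed in a Jinja template. The OpenAI-style tool schema shown is an assumption for illustration, not TabbyAPI's exact structure:

```python
import json
from jinja2 import Template

# OpenAI-style tool definition; the exact schema is assumed for illustration.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the weather for a city",
        },
    }
]

# `tools` is the structured object, so templates can iterate it directly...
names = Template(
    "{% for tool in tools %}{{ tool.function.name }}{% endfor %}"
).render(tools=tools)

# ...while `tools_json` embeds the full definitions as one JSON string.
prompt = Template("Available tools: {{ tools_json }}").render(
    tools_json=json.dumps(tools)
)

print(names)
print(prompt)
```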
## Creating a Tool Calling Prompt Template
Here's how to create a TabbyAPI tool calling prompt template:
@@ -142,4 +147,4 @@ When creating your own tool calling `prompt_template`, it's best to reference th
## Support and Bug Reporting
For bugs, please create a detailed issue with the model, prompt template, and conversation that caused it. Alternatively, join our [Discord](https://discord.gg/sYQxnuD7Fj) and ask for Storm.