Update README.md
Add function calling configuration
# OpenVINO IR model with int8 quantization of Hermes-2-Theta-Llama-3-8B

Model definition for LocalAI:

```yaml
name: hermes-2-Theta-llama3
backend: transformers
parameters:
  # ...
template:
  use_tokenizer_template: true
```
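With the definition above loaded, the model is reachable through LocalAI's OpenAI-compatible chat completions endpoint. A minimal sketch in Python, assuming a LocalAI instance on the default port 8080 (adjust the URL to your deployment):

```python
import json
import urllib.request

# Assumed endpoint of a running LocalAI instance (default port 8080).
LOCALAI_URL = "http://localhost:8080/v1/chat/completions"


def build_chat_request(user_message: str) -> dict:
    """Assemble the request body; "model" must match the `name`
    field of the LocalAI model definition above."""
    return {
        "model": "hermes-2-Theta-llama3",
        "messages": [{"role": "user", "content": user_message}],
    }


def chat(user_message: str) -> str:
    """POST the request and return the assistant's reply text."""
    req = urllib.request.Request(
        LOCALAI_URL,
        data=json.dumps(build_chat_request(user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Example (requires a running LocalAI server):
# print(chat("What is OpenVINO?"))
```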

LocalAI configuration for function calling:

```yaml
name: hermes-2-Theta-llama3
backend: transformers
parameters:
  model: fakezeta/Hermes-2-Theta-Llama-3-8B-ov-int8
context_size: 8192
type: OVModelForCausalLM
function:
  # disable injecting the "answer" tool
  disable_no_action: true
  # allow the grammar to also return plain messages
  grammar_message: true
  # prefix to add to the grammar
  grammar_prefix: '<tool_call>\n'
  return_name_in_function_response: true
  # Without grammar, uncomment the lines below.
  # Warning: this relies solely on the LLM's ability
  # to generate a correct function call.
  # no_grammar: true
  # json_regex_match: "(?s)<tool_call>(.*?)</tool_call>"
  replace_results:
    "<tool_call>": ""
    "\'": "\""

template:
  chat_message: |
    <|im_start|>{{if eq .RoleName "assistant"}}assistant{{else if eq .RoleName "system"}}system{{else if eq .RoleName "tool"}}tool{{else if eq .RoleName "user"}}user{{end}}
    {{- if .FunctionCall }}
    <tool_call>
    {{- else if eq .RoleName "tool" }}
    <tool_response>
    {{- end }}
    {{- if .Content}}
    {{.Content }}
    {{- end }}
    {{- if .FunctionCall}}
    {{toJson .FunctionCall}}
    {{- end }}
    {{- if .FunctionCall }}
    </tool_call>
    {{- else if eq .RoleName "tool" }}
    </tool_response>
    {{- end }}<|im_end|>
  # Prompt format: https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B-GGUF#prompt-format-for-function-calling
  function: |
    <|im_start|>system
    You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. Here are the available tools:
    <tools>
    {{range .Functions}}
    {'type': 'function', 'function': {'name': '{{.Name}}', 'description': '{{.Description}}', 'parameters': {{toJson .Parameters}} }}
    {{end}}
    </tools>
    Use the following pydantic model json schema for each tool call you will make:
    {'title': 'FunctionCall', 'type': 'object', 'properties': {'arguments': {'title': 'Arguments', 'type': 'object'}, 'name': {'title': 'Name', 'type': 'string'}}, 'required': ['arguments', 'name']}
    For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
    <tool_call>
    {'arguments': <args-dict>, 'name': <function-name>}
    </tool_call><|im_end|>
    {{.Input -}}
    <|im_start|>assistant
    <tool_call>
  chat: |
    {{.Input -}}
    <|im_start|>assistant
  completion: |
    {{.Input}}
```

To run the model directly with LocalAI:

```