The LLama3ChatSession example claims that it "supports models such as llama3, llama2, phi3, qwen1.5, etc.", but it fails with llama2 for several reasons.

Llama2 expects the [INST]-style prompt template:

    <s>[INST] <<SYS>>
    {{ system_prompt }}
    <</SYS>>

    {{ user_message_1 }} [/INST] {{ model_answer_1 }} </s><s>[INST] {{ user_message_2 }} [/INST]
However, Llama2 doesn't provide a chat template in its GGUF metadata, so by default llama.cpp falls back to a different template:

    <|im_start|>user
    hello<|im_end|>
    <|im_start|>assistant
    response<|im_end|>
    <|im_start|>user
    again<|im_end|>
    <|im_start|>assistant
    response<|im_end|>
As a result, a conversation run through the LLama3 example (after fixing the NullReferenceException) does not follow the expected Llama2 format.
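For reference, one way to see whether a given GGUF file ships its own template is to inspect the model metadata before building the session. This is a minimal sketch, assuming LLamaWeights.Metadata exposes the GGUF key/value pairs and that "tokenizer.chat_template" is the key llama.cpp reads; the model path is a placeholder:

```csharp
using System;
using LLama;
using LLama.Common;

class TemplateCheck
{
    static void Main()
    {
        // Placeholder path; point this at a llama2 chat GGUF file.
        var parameters = new ModelParams("models/llama-2-7b-chat.Q4_K_M.gguf");
        using var model = LLamaWeights.LoadFromFile(parameters);

        // "tokenizer.chat_template" is the GGUF key the chat template is stored under.
        // Llama2 GGUF files typically do not set it, which is what triggers the
        // fallback template (and the NullReferenceException in the example).
        if (model.Metadata.TryGetValue("tokenizer.chat_template", out var template))
            Console.WriteLine($"Model ships its own chat template:\n{template}");
        else
            Console.WriteLine("No chat template in metadata; a custom HistoryTransform is needed.");
    }
}
```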
It would be nice to have an example that uses a custom HistoryTransform for llama2.
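Something along these lines could serve as a starting point. It is only a sketch, not LLamaSharp's official example: Llama2HistoryTransform is a hypothetical name, and it assumes the IHistoryTransform interface with HistoryToText, TextToHistory and Clone members (verify against the LLamaSharp version in use):

```csharp
using System.Text;
using LLama.Abstractions;
using LLama.Common;

// Hypothetical transform that renders the chat history in the Llama2 [INST] format
// instead of the ChatML fallback shown above.
public class Llama2HistoryTransform : IHistoryTransform
{
    public string HistoryToText(ChatHistory history)
    {
        var sb = new StringBuilder();
        string system = null;

        foreach (var message in history.Messages)
        {
            switch (message.AuthorRole)
            {
                case AuthorRole.System:
                    system = message.Content;
                    break;
                case AuthorRole.User:
                    sb.Append("<s>[INST] ");
                    if (system != null)
                    {
                        // The system prompt is folded into the first user turn only.
                        sb.Append("<<SYS>>\n").Append(system).Append("\n<</SYS>>\n\n");
                        system = null;
                    }
                    sb.Append(message.Content).Append(" [/INST]");
                    break;
                case AuthorRole.Assistant:
                    sb.Append(' ').Append(message.Content).Append(" </s>");
                    break;
            }
        }
        return sb.ToString();
    }

    public ChatHistory TextToHistory(AuthorRole role, string text)
    {
        var history = new ChatHistory();
        history.AddMessage(role, text);
        return history;
    }

    public IHistoryTransform Clone() => new Llama2HistoryTransform();
}
```

It could then be attached to the session with something like `new ChatSession(executor).WithHistoryTransform(new Llama2HistoryTransform())`, assuming the fluent WithHistoryTransform method is available.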