hf-llm-api / networks / huggingchat_streamer.py

Commit History

:zap: [Enhance] Quieter openai auth, use cffi to request hf-chat id, and log token count to console when the limit is exceeded
0f710a2

Hansimov committed

:boom: [Fix] Value of (delta_)content or content_type not assigned
b40b5fc

Hansimov committed

:gem: [Feature] Modularize TokenChecker, and fix gated model repos with alternatives
8df3985

Hansimov committed

:gem: [Feature] HuggingchatStreamer: Support no-stream mode
caedafb

Hansimov committed

:gem: [Feature] HuggingchatStreamer: Enable chat_return_generator
62d5db7

Hansimov committed

:gem: [Feature] New TokenChecker: count tokens and check token limit
64105c5

Hansimov committed

:gem: [Feature] Split into Requester and Streamer, and mock chat history with messages
c43287d

Hansimov committed

:gem: [Feature] Enable chat response with get_message_id
e4d11b8

Hansimov committed

:hammer: [WIP] Enabling get_message_id
a5ac953

Hansimov committed

:gem: [Feature] HuggingchatStreamer: Build pipeline of chat_response
55b0c51

Hansimov committed

:gem: [Feature] HuggingchatStreamer: New log request and response
c706328

Hansimov committed

:gem: [Feature] HuggingchatStreamer: New get_hf_chat_id, and improve get_conversation_id
391cdfe

Hansimov committed

:gem: [Feature] New HuggingchatStreamer: get_conversation_id
f1218fc

Hansimov committed