Would it be more powerful if it could have instant search and long-term memory capabilities like Blenderbot 2.0?
Is it possible to use a plug-in to make it have instant search and long-term memory capabilities?
You can use Text-Gen-Webui to give it a memory of up to 2048 tokens.
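To make concrete what a 2048-token memory means in practice, here is a minimal Python sketch, not Text-Gen-Webui's actual code: older turns are simply dropped so the prompt stays inside the budget. The `count_tokens`, `build_prompt`, `persona`, and `history` names are just placeholders for illustration.

```python
# A rough sketch of a fixed 2048-token context window (assumption:
# token counts are approximated by whitespace splitting; a real setup
# would use the model's own tokenizer).
MAX_TOKENS = 2048

def count_tokens(text: str) -> int:
    # Crude stand-in for a proper tokenizer.
    return len(text.split())

def build_prompt(persona: str, history: list[str], new_message: str) -> str:
    budget = MAX_TOKENS - count_tokens(persona) - count_tokens(new_message)
    kept = []
    # Walk the history backwards so the newest turns survive truncation.
    for turn in reversed(history):
        cost = count_tokens(turn)
        if cost > budget:
            break
        kept.append(turn)
        budget -= cost
    return "\n".join([persona, *reversed(kept), new_message])

history = ["User: Hello!", "Bot: Hi there, how can I help?"]
print(build_prompt("You are a friendly assistant.", history, "User: Tell me a joke."))
```

Anything that falls outside that window is simply forgotten, which is why real long-term memory needs separate storage.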
Is it possible to use the plug-in to achieve instant search?
I can't program.
> Would it be more powerful if it could have instant search and long-term memory capabilities like Blenderbot 2.0?
Indeed, I believe it would. However:
> Is it possible to use a plug-in to make it have instant search and long-term memory capabilities?
Unfortunately, this can't really be a model "plug-in". It would need to be an entirely different platform with many moving parts: it would need to query a search engine, it would need some sort of storage to save long-term memories in, and so on.
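To give a sense of those moving parts, here is a minimal Python sketch of that kind of setup under stated assumptions: a small JSON file stands in for long-term memory, and `web_search()` is only a placeholder for whatever real search API you would wire in. None of these functions (`save_memory`, `recall_memories`, `build_augmented_prompt`) exist in the webui or in any current plug-in.

```python
import json
from pathlib import Path

MEMORY_FILE = Path("memories.json")  # hypothetical local memory store

def save_memory(fact: str) -> None:
    # Append a fact to a simple JSON file acting as long-term storage.
    memories = json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []
    memories.append(fact)
    MEMORY_FILE.write_text(json.dumps(memories, indent=2))

def recall_memories(query: str, limit: int = 3) -> list[str]:
    # Naive keyword overlap; a real system would use embeddings or BM25.
    if not MEMORY_FILE.exists():
        return []
    memories = json.loads(MEMORY_FILE.read_text())
    words = set(query.lower().split())
    scored = sorted(memories, key=lambda m: len(words & set(m.lower().split())), reverse=True)
    return scored[:limit]

def web_search(query: str) -> list[str]:
    # Placeholder for a real search API call (e.g. SerpAPI or SearxNG).
    return [f"(search result for: {query})"]

def build_augmented_prompt(user_message: str) -> str:
    # Stitch recalled memories and search results into the prompt
    # that would then be handed to the language model.
    context = recall_memories(user_message) + web_search(user_message)
    return "\n".join(["Relevant context:", *context, "", f"User: {user_message}", "Bot:"])

save_memory("The user's favourite colour is green.")
print(build_augmented_prompt("What is my favourite colour?"))
```

In a real deployment the keyword lookup would be replaced by something like an embedding search, and the whole pipeline sits outside the model itself, which is exactly why it can't be packaged as a simple plug-in.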
Is there any way to make the model read character and other .json files at the same time?