WIP proof-of-concept script for local Ollama instance NPC chat, Alpha 0.1

Work In Progress
Tested on ServUO 57.3
Hello! There were two scripts by ParanoiaHD that enabled NPCs to query OpenAI's servers for chat functionality. One bot did not work at all; the other I was able to spawn, but without OpenAI access it obviously generated no speech. This script is an attempt to have the NPC query a chat model running locally under Ollama for Windows instead.

It was tested on ServUO 57.3 with the Llama3:Latest model from the Ollama website, and is by no means a proper script. I am not a programmer, but I was able to successfully talk to the generated NPC in-game with it. As with any LLM, you can give the model instructions and it will follow your prompts; it tends to generate a wall of text unless you ask it to keep replies short and concise.

Right now the NPC can only chat. If there is a way to extend the functionality of NPCs by interfacing with the Ollama API, please share your creations! Enjoy the chat!
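For anyone curious what "querying a local Ollama instance" looks like before digging into the C# script, here is a minimal Python sketch against Ollama's public `/api/chat` endpoint on its default port 11434. The system prompt and NPC framing are illustrative assumptions, not taken from the script itself:

```python
import json
import urllib.request

# Default endpoint for a local Ollama for Windows install.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_payload(player_text,
                  system_prompt="You are a tavern keeper. Keep replies short and concise."):
    """Build the JSON body for Ollama's /api/chat endpoint.

    stream=False makes Ollama return one complete JSON reply instead of
    a stream of chunks, which is simpler to handle from a game script.
    The system prompt here is a hypothetical example of instructing the NPC.
    """
    return {
        "model": "llama3:latest",
        "stream": False,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": player_text},
        ],
    }

def extract_reply(response_json):
    """Pull the NPC's reply text out of a non-streaming /api/chat response."""
    return response_json["message"]["content"]

def ask_npc(player_text):
    """Send the player's speech to the local Ollama instance, return the reply."""
    body = json.dumps(build_payload(player_text)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return extract_reply(json.load(resp))
```

The ServUO script would do the equivalent from C# (e.g. in the NPC's speech handler), but the request and response shapes are the same.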