c# · console · large-language-model · llama · ollama

Ollama not saving anything in context


I am experimenting with Ollama in C# and managed to get it to produce output and even interact with my code. However, using the example they provided, I am running into an issue where the LLM is not retaining previous conversations, and I ended up with the sloppy workaround of injecting the instructions alongside every prompt.

Here is my Code:

```csharp
public async void Prompt(string prompt)
{
    output = "";
    string input = instructions + prompt;
    context = await ollama.StreamCompletion(input, context, stream => output += (stream.Response));
    Console.WriteLine(output);
}
```

`output` is a string; I might change this to something that can hold more variables later.

`context` is a ConversationContext variable.

The idea is to give it a set of instructions that it should follow for the whole conversation. Currently, though, those instructions need to be injected by prepending them to every prompt. This is not ideal, as it defeats the purpose of it being an LLM.

What I want to do is: inject the instructions on launch only, call prompts as normal, and have it follow the instructions when needed.

I tried `context +=`, but that's not a valid operation. I have looked for a solution to this, but I only found one other question about it, and it was from February.
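One way to get the "inject on launch only" behavior, assuming the same `StreamCompletion` API and the `ollama`, `context`, and `instructions` members shown in the question's code, is to send the instructions as their own completion once at startup and keep the returned context; every later prompt then carries them implicitly. This is only a sketch under those assumptions, and `InitializeAsync` is a hypothetical name:

```csharp
// Hypothetical startup priming: send the instructions once and keep the
// returned ConversationContext, which encodes them for all later prompts.
public async Task InitializeAsync()
{
    // Discard the streamed response text; we only need the updated context.
    context = await ollama.StreamCompletion(instructions, context, _ => { });
}
```

Any subsequent call that passes `context` back into `StreamCompletion` would then see the instructions without them being prepended to the prompt.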


Solution

  • Thanks to Conway I was able to fix my code. Since I am used to programming in Unity, I always used `void` instead of `Task` because I didn't know about it. To fix what I had, I did:

    ```csharp
    public async Task Prompt(string prompt)
    {
        output = "";
        string input = prompt;
        context = await ollama.StreamCompletion(input, context, stream => output += (stream.Response));
        Console.WriteLine(output);
        ConsoleCommands.aIResponse = true;
        ConsoleCommands.checkCommand(output);
    }
    ```
    

    Thanks to this, I now have an AI-integrated program that remembers prior prompts and conversations.

    I hope that this will help someone else in the future!
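    To illustrate why the `async Task` change matters: an `async void` method cannot be awaited, so a second prompt can start before `context` has been reassigned, silently dropping the history. With `async Task`, the caller can sequence the calls. A usage sketch, assuming a `bot` instance exposing the fixed `Prompt` method above (both names are illustrative):

    ```csharp
    // Awaiting each call guarantees `context` is updated before the next
    // prompt is sent, so the model sees the full conversation history.
    await bot.Prompt("My name is Alice.");
    await bot.Prompt("What is my name?"); // can now be answered from context
    ```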