
AI::Ollama::Client and 'ollama/ollama-curated.yaml'


I installed AI::Ollama::Client on Strawberry Perl 5.38 and started the Ollama server.

But when I try to connect to the Ollama server using AI::Ollama::Client:

use strict;
use warnings;
use feature 'say';    # needed for say() under strict
use AI::Ollama::Client;
my $client = AI::Ollama::Client->new(
    server => 'http://127.0.0.1:11434',
);

my $info = $client->listModels()->get;
for my $model ($info->models->@*) {
    say $model->model; # e.g. llama2:latest
}

I am getting this error:

Could not open 'ollama/ollama-curated.yaml' for reading: No such file or directory at C:/Dev/Perl/strawberry-perl-5.38.2.2-64bit-portable/perl/site/lib/YAML/PP/Lexer.pm line 141. at C:/Dev/Perl/strawberry-perl-5.38.2.2-64bit-portable/perl/site/lib/YAML/PP/Loader.pm line 94.

Any idea how to fix it?


Solution

  • I see the files in the distro, but I don't see anything that would cause them to be installed (copied). This appears to be a bug in the distro, and a ticket has been filed.

    As a workaround, you can copy the file from the distro into your project, and use the following:

    use AI::Ollama::Client qw( );
    use FindBin            qw( $RealBin );
    use YAML::PP           qw( );
    
    my $yaml_parser = YAML::PP->new( boolean => 'JSON::PP' );
    my $schema = $yaml_parser->load_file( "$RealBin/ollama-curated.yaml" );
    
    my $client = AI::Ollama::Client->new(
       server => ...,
       schema => $schema,
    );
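
    The copy step itself is just placing `ollama-curated.yaml` next to your script. A minimal sketch, assuming you have unpacked the AI-Ollama-Client distribution somewhere (for example with `cpanm --look AI::Ollama::Client`); `DIST_DIR` and `PROJECT_DIR` are hypothetical placeholders you should point at your own paths:

    ```shell
    # DIST_DIR: where the AI-Ollama-Client distribution was unpacked
    # PROJECT_DIR: where your script (and thus $RealBin) lives
    DIST_DIR=${DIST_DIR:-./AI-Ollama-Client}
    PROJECT_DIR=${PROJECT_DIR:-.}

    # The schema sits at ollama/ollama-curated.yaml inside the distribution,
    # matching the path in the error message.
    if [ -f "$DIST_DIR/ollama/ollama-curated.yaml" ]; then
        cp "$DIST_DIR/ollama/ollama-curated.yaml" "$PROJECT_DIR/"
        echo "copied ollama-curated.yaml to $PROJECT_DIR"
    else
        echo "schema not found under $DIST_DIR" >&2
    fi
    ```

    On Strawberry Perl the same copy can of course be done in Explorer or with `copy` in cmd.exe; what matters is that `ollama-curated.yaml` ends up in the directory the script loads it from.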