LLamaSharp.SemanticKernel

LLamaSharp.SemanticKernel provides connectors for Semantic Kernel, an SDK for integrating various LLM interfaces behind a single implementation. With it, you can add local LLaMA inference as another connection point alongside your existing connections.

For reference on how to use the connectors, see the following examples:

ITextCompletion

```csharp
// Assumes `modelPath` points at a local model file.
var parameters = new ModelParams(modelPath);
using var model = LLamaWeights.LoadFromFile(parameters);
// LLamaSharpTextCompletion accepts any ILLamaExecutor.
var ex = new StatelessExecutor(model, parameters);
var builder = new KernelBuilder();
builder.WithAIService<ITextCompletion>("local-llama", new LLamaSharpTextCompletion(ex), true);
```
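Once the service is registered, the kernel can route prompts to the local model. The sketch below is a hypothetical usage example, not from the original README; it assumes the pre-1.0 Semantic Kernel API (`CreateSemanticFunction` / `InvokeAsync`) available around the time this package was written.

```csharp
// Hypothetical usage sketch (assumes pre-1.0 Semantic Kernel API).
var kernel = builder.Build();

// Define a simple semantic function; {{$input}} is filled at invocation time.
var summarize = kernel.CreateSemanticFunction("One sentence about {{$input}}:");

// The prompt is executed by the "local-llama" ITextCompletion service registered above.
var result = await summarize.InvokeAsync("llamas");
Console.WriteLine(result);
```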

IChatCompletion

```csharp
// Assumes `modelPath` points at a local model file.
var parameters = new ModelParams(modelPath);
using var model = LLamaWeights.LoadFromFile(parameters);
using var context = model.CreateContext(parameters);
// LLamaSharpChatCompletion requires an InteractiveExecutor, since chat needs a stateful conversation context.
var ex = new InteractiveExecutor(context);
var chatGPT = new LLamaSharpChatCompletion(ex);
```
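The chat connector can then be driven through Semantic Kernel's chat-history API. The sketch below is a hypothetical example, not from the original README; it assumes the `IChatCompletion` interface of pre-1.0 Semantic Kernel (`CreateNewChat`, `AddUserMessage`, `GenerateMessageAsync`).

```csharp
// Hypothetical usage sketch (assumes pre-1.0 Semantic Kernel IChatCompletion API).
var chatHistory = chatGPT.CreateNewChat("You are a helpful assistant.");
chatHistory.AddUserMessage("Hi, what is a llama?");

// Generate the assistant's reply using the local model.
var reply = await chatGPT.GenerateMessageAsync(chatHistory);
Console.WriteLine(reply);
```

Because the underlying executor is interactive, the conversation state carries across turns: append the reply and the next user message to `chatHistory` and call `GenerateMessageAsync` again.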