# LLamaSharp.SemanticKernel
LLamaSharp.SemanticKernel provides connectors for Semantic Kernel, an SDK for integrating various LLM interfaces behind a single implementation. With it, you can register local LLaMA models as just another AI service alongside your existing connectors.

For reference on how to implement it, see the following examples:
## ITextCompletion

```csharp
using var model = LLamaWeights.LoadFromFile(parameters);
// "parameters" is the LLamaSharp ModelParams instance used to load your local model file.
// LLamaSharpTextCompletion can accept any ILLamaExecutor; StatelessExecutor suits one-shot prompts.
var ex = new StatelessExecutor(model, parameters);
var builder = new KernelBuilder();
builder.WithAIService<ITextCompletion>("local-llama", new LLamaSharpTextCompletion(ex), true);
```
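Once the service is registered, the kernel can drive the local model like any other text-completion backend. The following is a minimal sketch, assuming the same pre-1.0 / 1.0-beta Semantic Kernel API the snippet above targets (`KernelBuilder`, `WithAIService<ITextCompletion>`); the prompt text is illustrative, and the calls for defining and invoking prompt functions (`CreateSemanticFunction`, `RunAsync`, `GetValue<string>()`) have been renamed in later Semantic Kernel releases.

```csharp
// Build the kernel with the local LLaMA service registered above.
var kernel = builder.Build();

// Define a prompt function; {{$input}} is filled in at invocation time.
var prompt = @"{{$input}}

One line TLDR with the fewest words.";
var summarize = kernel.CreateSemanticFunction(prompt);

// Run the function against the local model and print the completion.
var result = await kernel.RunAsync("The quick brown fox jumps over the lazy dog.", summarize);
Console.WriteLine(result.GetValue<string>());
```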
## IChatCompletion

```csharp
using var model = LLamaWeights.LoadFromFile(parameters);
using var context = model.CreateContext(parameters);
// LLamaSharpChatCompletion requires an InteractiveExecutor, since it keeps the conversation state between turns.
var ex = new InteractiveExecutor(context);
var chatGPT = new LLamaSharpChatCompletion(ex);
```
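From here the `chatGPT` service can be used with a `ChatHistory` as usual. Below is a minimal sketch, assuming the `CreateNewChat` / `GenerateMessageAsync` helpers from pre-1.0 Semantic Kernel; newer releases expose the same flow through `IChatCompletionService`, so the exact calls may differ.

```csharp
// Start a new chat with a system prompt and add a user message.
var chat = chatGPT.CreateNewChat("You are a concise, helpful assistant.");
chat.AddUserMessage("Write a haiku about llamas.");

// Generate the assistant reply with the local model and append it to the history.
string reply = await chatGPT.GenerateMessageAsync(chat);
chat.AddAssistantMessage(reply);
Console.WriteLine(reply);
```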
## ITextEmbeddingGeneration

```csharp
using var model = LLamaWeights.LoadFromFile(parameters);
// LLamaEmbedder generates embeddings with the same local model; VolatileMemoryStore keeps them in memory.
var embedding = new LLamaEmbedder(model, parameters);
var kernelWithCustomDb = Kernel.Builder
    .WithLoggerFactory(ConsoleLogger.LoggerFactory)
    .WithAIService<ITextEmbeddingGeneration>("local-llama-embed", new LLamaSharpEmbeddingGeneration(embedding), true)
    .WithMemoryStorage(new VolatileMemoryStore())
    .Build();
```
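With the embedder and a memory store attached, the kernel's semantic memory can save text and recall it by similarity. The sketch below assumes the pre-1.0 `kernel.Memory` (`ISemanticTextMemory`) API that matches the `Kernel.Builder` style above; the collection name and sample text are only illustrative.

```csharp
const string collection = "facts";

// Embed and store a fact in the volatile (in-memory) store.
await kernelWithCustomDb.Memory.SaveInformationAsync(collection,
    text: "Llamas are members of the camelid family.", id: "llama-fact-1");

// Query the store; results are ranked by embedding similarity computed with the local model.
await foreach (var match in kernelWithCustomDb.Memory.SearchAsync(collection,
    "Which animal family do llamas belong to?", limit: 1, minRelevanceScore: 0))
{
    Console.WriteLine($"{match.Relevance:F2}: {match.Metadata.Text}");
}
```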