Commit Graph

41 Commits

Author SHA1 Message Date
Rinne 6bf010d719
Merge pull request #689 from zsogitbe/master
SemanticKernel: Correcting non-standard way of working with PromptExecutionSettings
2024-05-01 01:52:43 +08:00
Zoli Somogyi 54c01d4c2c Making old code obsolete - SemanticKernel: Correcting the way of working with PromptExecutionSettings 2024-04-30 19:28:31 +02:00
Zoli Somogyi 2aa96b206f Adding Response Format - Correcting non-standard way of working with PromptExecutionSettings
The requested response format can be used downstream to post-process the messages.
2024-04-27 09:39:40 +02:00
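A minimal sketch of the idea in the commit above, assuming the requested format reaches downstream code as a simple string (the helper name and the "json_object" value are illustrative, not the repository's actual API):

```csharp
using System.Text.Json;

// Hypothetical helper: post-process raw model output according to the response
// format requested via the prompt execution settings.
public static class ResponseFormatting
{
    public static string PostProcess(string rawOutput, string? responseFormat)
    {
        // "json_object" is an illustrative format name; the values the library
        // actually accepts may differ.
        if (responseFormat == "json_object")
        {
            using var doc = JsonDocument.Parse(rawOutput);     // throws if the model emitted invalid JSON
            return JsonSerializer.Serialize(doc.RootElement);  // re-serialise as compact, valid JSON
        }

        return rawOutput; // plain text: pass through unchanged
    }
}
```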
Zoli Somogyi 59a0afdb77 Renaming files to correspond to class names 2024-04-24 08:24:02 +02:00
Zoli Somogyi ab8dd0dfc7 Correcting non-standard way of working with PromptExecutionSettings
The extension of PromptExecutionSettings applies not only to chat completion, but also to text completion and text embedding.
2024-04-24 08:06:40 +02:00
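For context, the standard Semantic Kernel pattern this series moves toward looks roughly like the sketch below: one settings type derived from PromptExecutionSettings, shared by the chat completion, text completion, and embedding services. The class and property names here are illustrative, not the repository's own:

```csharp
using System.Text.Json;
using Microsoft.SemanticKernel;

// Hypothetical settings type shared by chat completion, text completion and
// text embedding services.
public class LLamaSharpPromptExecutionSettingsSketch : PromptExecutionSettings
{
    public double Temperature { get; set; } = 0.7;
    public int MaxTokens { get; set; } = 256;

    // Convert whatever the caller passed (null, this type, or a generic
    // PromptExecutionSettings built from a prompt template) into this type.
    public static LLamaSharpPromptExecutionSettingsSketch FromRequestSettings(PromptExecutionSettings? settings)
    {
        return settings switch
        {
            null => new LLamaSharpPromptExecutionSettingsSketch(),
            LLamaSharpPromptExecutionSettingsSketch typed => typed,
            // Fall back to a JSON round-trip so generic settings still map over.
            _ => JsonSerializer.Deserialize<LLamaSharpPromptExecutionSettingsSketch>(
                     JsonSerializer.Serialize(settings))
                 ?? new LLamaSharpPromptExecutionSettingsSketch()
        };
    }
}
```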
Chirag Karia 50e139b0a2
Update LLamaSharpChatCompletion Semantic Kernel inference to send only the most recent user message from the SK ChatHistory instance when using StatefulExecutor models 2024-04-17 12:57:55 -04:00
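A minimal sketch of that behaviour, assuming Semantic Kernel's 1.x `ChatHistory`/`AuthorRole` types (the helper itself is illustrative, not the repository's code):

```csharp
using System.Linq;
using Microsoft.SemanticKernel.ChatCompletion;

public static class StatefulPromptSelection
{
    public static string SelectPrompt(ChatHistory history, bool executorIsStateful)
    {
        if (executorIsStateful)
        {
            // A stateful executor already holds the earlier turns in its own context,
            // so only the latest user message needs to be forwarded.
            return history.LastOrDefault(m => m.Role == AuthorRole.User)?.Content ?? string.Empty;
        }

        // A stateless executor still needs the whole conversation rendered into one prompt.
        return string.Join("\n", history.Select(m => $"{m.Role}: {m.Content}"));
    }
}
```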
Rinne 4640c6af04 release: update release info of packages. 2024-04-06 14:20:36 +08:00
Rinne d67658a0d6
docs: update the information to v0.11.0. 2024-04-01 01:38:40 +08:00
xbotter a019b5cc24
📝 Update LLamaSharpChatCompletion and LLama.Unittest
- Updated LLamaSharpChatCompletion class in LLama.SemanticKernel/ChatCompletion/LLamaSharpChatCompletion.cs
  - Changed the type of the "_model" field from "StatelessExecutor" to "ILLamaExecutor"
  - Updated the constructor to accept an "ILLamaExecutor" parameter instead of a "StatelessExecutor" parameter
- Updated the LLama.SemanticKernel/LLamaSharp.SemanticKernel.csproj project file

- Updated LLama.Unittest project in LLama.Unittest/LLama.Unittest.csproj
  - Added a "PackageReference" for "Moq" version 4.20.70
- Added ExtensionMethodsTests class in LLama.Unittest/SemanticKernel/ExtensionMethodsTests.cs
  - Added tests for the "ToLLamaSharpChatHistory" and "ToLLamaSharpInferenceParams" extension methods
- Added LLamaSharpChatCompletionTests class in LLama.Unittest/SemanticKernel/LLamaSharpChatCompletionTests.cs
  - Added tests for the LLamaSharpChatCompletion class

ℹ️ The LLamaSharpChatCompletion class in the LLama.SemanticKernel project, which provides chat completion functionality for LLamaSharp, now depends on the ILLamaExecutor interface instead of the StatelessExecutor class, allowing for better abstraction and flexibility in its implementation. The LLama.Unittest project has also been updated with tests for LLamaSharpChatCompletion and the extension methods it uses.
2024-03-18 21:49:52 +08:00
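A minimal sketch of the shape of this change; the `...Sketch` names are hypothetical so this is not mistaken for the repository's actual classes, and it assumes only that `ILLamaExecutor` is an interface Moq can mock and that xUnit is the test framework used by LLama.Unittest:

```csharp
using LLama.Abstractions;
using Moq;
using Xunit;

// The service now depends on the ILLamaExecutor abstraction instead of the
// concrete StatelessExecutor (class body heavily simplified).
public sealed class LLamaSharpChatCompletionSketch
{
    private readonly ILLamaExecutor _model;   // was: StatelessExecutor

    public LLamaSharpChatCompletionSketch(ILLamaExecutor model)   // was: StatelessExecutor model
    {
        _model = model;
    }

    public ILLamaExecutor Executor => _model;
}

// The kind of test the abstraction enables: no model weights needed, just a mock.
public class LLamaSharpChatCompletionSketchTests
{
    [Fact]
    public void Constructor_AcceptsAnyExecutorImplementation()
    {
        var executor = new Mock<ILLamaExecutor>();   // Moq 4.20.x, as referenced by LLama.Unittest
        var service = new LLamaSharpChatCompletionSketch(executor.Object);

        Assert.Same(executor.Object, service.Executor);
    }
}
```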
xbotter 3f2e5c27ff
🔧 Update package references
- Update Microsoft.KernelMemory.Core to version 0.34.240313.1
- Update Microsoft.SemanticKernel to version 1.6.2
- Update Microsoft.SemanticKernel.Plugins.Memory to version 1.6.2-alpha
- Update Microsoft.KernelMemory.Abstractions to version 0.34.240313.1
- Update Microsoft.SemanticKernel.Abstractions to version 1.6.2
2024-03-14 22:17:59 +08:00
dependabot[bot] 33827a1ba8
build(deps): bump Microsoft.SemanticKernel.Abstractions (#542)
Bumps [Microsoft.SemanticKernel.Abstractions](https://github.com/microsoft/semantic-kernel) from 1.1.0 to 1.4.0.
- [Release notes](https://github.com/microsoft/semantic-kernel/releases)
- [Commits](https://github.com/microsoft/semantic-kernel/compare/dotnet-1.1.0...dotnet-1.4.0)

---
updated-dependencies:
- dependency-name: Microsoft.SemanticKernel.Abstractions
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-26 14:09:11 +00:00
Martin Evans c9c8cd0d62 - Swapped embeddings generator to use `llama_decode`
- Modified `GetEmbeddings` method to be async
2024-01-31 20:28:53 +00:00
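The caller-visible effect of this commit, as a small sketch with illustrative signatures (the repository's exact method names and return types may differ):

```csharp
using System.Threading;
using System.Threading.Tasks;

// Before: float[] GetEmbeddings(string text);
// After:  the call is awaitable, since it now drives decoding asynchronously.
public interface IEmbeddingSketch
{
    Task<float[]> GetEmbeddings(string text, CancellationToken cancellationToken = default);
}

public static class EmbeddingUsageSketch
{
    public static async Task<float[]> EmbedAsync(IEmbeddingSketch embedder, string text)
    {
        // Callers of the old synchronous API now need to await the result.
        return await embedder.GetEmbeddings(text);
    }
}
```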
xbotter 90815ae7d8
bump sk & km
- bump semantic kernel to 1.1.0
- bump kernel memory to 0.26
2024-01-22 19:03:28 +08:00
Aleksei Smirnov cc892a5eed Fix typos in SemanticKernel README file 2024-01-05 22:21:53 +03:00
xbotter 40ac944fb5
Bump sk to 1.0.1 2023-12-19 08:42:01 +08:00
xbotter 213b4be723
bump sk-1.0.0-rc4 2023-12-14 09:47:32 +08:00
xbotter 13a312b4ec
update sk to 1.0.0-rc3 & km to 0.18 2023-12-11 19:39:01 +08:00
xbotter a2b26faa7a
🔧 Refactor chat completion implementation
- Refactored the chat completion implementation in `LLamaSharpChatCompletion.cs` to use `StatelessExecutor` instead of `InteractiveExecutor`.
- Updated the chat history prompt in `LLamaSharpChatCompletion.cs` to include a conversation between the assistant and the user.
- Modified the `HistoryTransform` class in `HistoryTransform.cs` to append the assistant role to the chat history prompt.
- Updated the constructor of `LLamaSharpChatCompletion` to accept optional parameters for `historyTransform` and `outputTransform`.
- Modified the `GetChatCompletionsAsync` and `GetChatCompletions` methods in `LLamaSharpChatCompletion.cs` to use the new `StatelessExecutor` and `outputTransform`.
- Updated the `ExtensionMethods.cs` file to include the assistant and system roles in the list of anti-prompts.
2023-12-01 21:39:31 +08:00
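A sketch of the anti-prompt idea from the last bullet above, assuming LLamaSharp's `InferenceParams` type; the concrete role labels the repository uses may differ, and the optional `historyTransform`/`outputTransform` constructor parameters are not shown here:

```csharp
using LLama.Common;

public static class AntiPromptSketch
{
    // Role labels are added as stop keywords so the stateless executor stops
    // generating before it starts writing the next speaker's turn itself.
    public static InferenceParams WithRoleAntiPrompts(int maxTokens = 256)
    {
        return new InferenceParams
        {
            MaxTokens = maxTokens,
            AntiPrompts = new[] { "User:", "Assistant:", "System:" }
        };
    }
}
```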
Ian Foutz 060d7c273d Added a converter similar to the OpenAI one 2023-11-18 21:42:34 -06:00
xbotter 15db931a66
bump semantic kernel 1.0.0-beta8 2023-11-17 21:17:27 +08:00
xbotter 6c31f69720
bump semantic kernel to 1.0.0-beta-6 2023-11-15 15:53:06 +08:00
Yaohui Liu 502bb73b1e
fix typo. 2023-11-12 12:26:33 +08:00
Yaohui Liu d7675f7936
Merge branch 'master' of github.com:AsakusaRinne/LLamaSharp into cuda_detection 2023-11-12 12:10:31 +08:00
Yaohui Liu 4d2c5f1003
build: change nuget configuration for cuda detection. 2023-11-12 05:23:25 +08:00
Chirag Karia aa5e1ad541 Add ignoreCase parameter to ToLLamaSharpChatHistory extension method 2023-11-11 02:59:57 -05:00
Chirag Karia 1b4659dff9 Update ToLLamaSharpChatHistory extension method to be public and support semantic-kernel author roles 2023-11-11 00:43:11 -05:00
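These two ToLLamaSharpChatHistory commits amount to roughly the following conversion; a simplified sketch assuming Semantic Kernel's 1.x `ChatHistory`/`AuthorRole` and LLamaSharp's `LLama.Common` history types (the body is illustrative, not the repository's exact code):

```csharp
using System;
using Microsoft.SemanticKernel.ChatCompletion;

public static class ChatHistoryConversionSketch
{
    public static LLama.Common.ChatHistory ToLLamaSharpChatHistory(
        this ChatHistory chatHistory, bool ignoreCase = true)
    {
        var history = new LLama.Common.ChatHistory();

        foreach (var message in chatHistory)
        {
            // Map the "user"/"assistant"/"system" labels onto LLamaSharp's roles,
            // optionally ignoring case; anything unrecognised becomes Unknown.
            var role = Enum.TryParse<LLama.Common.AuthorRole>(message.Role.Label, ignoreCase, out var parsed)
                ? parsed
                : LLama.Common.AuthorRole.Unknown;

            history.AddMessage(role, message.Content ?? string.Empty);
        }

        return history;
    }
}
```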
Yaohui Liu 6a7e74e71b
build: add package for kernel-memory integration. 2023-11-04 22:38:06 +08:00
xbotter ab83016fb4
chore: update semantic kernel examples 2023-10-20 10:24:40 +08:00
Daniel Vaughan 10a7d68330 Change to nullable cast in LLamaSharpTextCompletion. 2023-10-13 17:03:19 +02:00
Daniel Vaughan f64a54c9c8 Support SemanticKernel 1.0.0-beta1 2023-10-13 13:56:21 +02:00
Tim Miller abeab9f0a1 Bump dependencies 2023-09-12 12:19:57 +09:00
Tim Miller dced651f8b Allow setting ChatRequestSettings Defaults and ChatSession 2023-09-11 19:21:51 +09:00
sa_ddam213 70b36f8996
Add Microsoft.Extensions.Logging.Abstractions, update any required deps 2023-09-09 09:52:11 +12:00
Yaohui Liu 3067e01a4b
build: add package info for LLamaSharp.semantic-kernel 2023-09-06 02:50:23 +08:00
Tim Miller 94a395240a Bump example, readme 2023-09-02 14:21:02 +09:00
Tim Miller 521f068d64 Add Embedding for Semantic Kernel 2023-09-01 22:42:13 +09:00
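The direction of the embedding integration, as a deliberately generic sketch: both the Semantic Kernel embedding interface and LLamaSharp's embedder signatures have changed across versions, so the adapter below takes the text-to-vector function as a delegate rather than naming either API:

```csharp
using System;
using System.Collections.Generic;

// Sketch: adapt any text->vector function (e.g. LLamaSharp's embedder) to the
// batch shape Semantic Kernel's memory components expect.
public sealed class EmbeddingAdapterSketch
{
    private readonly Func<string, float[]> _embed;

    public EmbeddingAdapterSketch(Func<string, float[]> embed) => _embed = embed;

    public IList<ReadOnlyMemory<float>> GenerateEmbeddings(IList<string> data)
    {
        var result = new List<ReadOnlyMemory<float>>(data.Count);
        foreach (var text in data)
        {
            result.Add(_embed(text)); // float[] converts implicitly to ReadOnlyMemory<float>
        }
        return result;
    }
}
```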
Tim Miller d4a57fffef README, Cleanup 2023-09-01 10:03:34 +09:00
Tim Miller 2bde188c64 Change Namespace 2023-08-31 22:22:38 +09:00
Tim Miller a81edacbfb Remove embedding for now 2023-08-31 18:26:03 +09:00
Tim Miller 98bfbe1d50 Update 2023-08-31 18:24:07 +09:00
Tim Miller 9a1d6f99f2 Add Semantic Kernel support 2023-08-31 17:24:44 +09:00