diff --git a/LLama.Examples/LLama.Examples.csproj b/LLama.Examples/LLama.Examples.csproj
index dcd40c19..02066fa0 100644
--- a/LLama.Examples/LLama.Examples.csproj
+++ b/LLama.Examples/LLama.Examples.csproj
@@ -29,7 +29,7 @@
-
+
diff --git a/LLama.KernelMemory/LLamaSharp.KernelMemory.csproj b/LLama.KernelMemory/LLamaSharp.KernelMemory.csproj
index 54766b02..7fd99e2c 100644
--- a/LLama.KernelMemory/LLamaSharp.KernelMemory.csproj
+++ b/LLama.KernelMemory/LLamaSharp.KernelMemory.csproj
@@ -4,6 +4,27 @@
     <TargetFramework>net6.0</TargetFramework>
     <ImplicitUsings>enable</ImplicitUsings>
     <Nullable>enable</Nullable>
+
+    <Version>0.7.1</Version>
+    <Authors>Xbotter</Authors>
+    <Company>SciSharp STACK</Company>
+    <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
+    <Copyright>MIT, SciSharp STACK $([System.DateTime]::UtcNow.ToString(yyyy))</Copyright>
+    <RepositoryUrl>https://github.com/SciSharp/LLamaSharp</RepositoryUrl>
+    <RepositoryType>git</RepositoryType>
+    <PackageIconUrl>https://avatars3.githubusercontent.com/u/44989469?s=200&amp;v=4</PackageIconUrl>
+    <PackageTags>LLama, LLM, GPT, ChatGPT, kernel-memory, vector search, SciSharp</PackageTags>
+    <Description>
+      The integration of LLamaSharp and Microsoft kernel-memory. It could make it easy to support document search for LLamaSharp model inference.
+    </Description>
+    <PackageReleaseNotes>
+      Support integration with kernel-memory
+    </PackageReleaseNotes>
+    <PackageLicenseExpression>MIT</PackageLicenseExpression>
+    <PackageOutputPath>packages</PackageOutputPath>
+    <Platforms>AnyCPU;x64;Arm64</Platforms>
+    <PackageId>LLamaSharp.kernel-memory</PackageId>
+    <Configurations>Debug;Release;GPU</Configurations>
diff --git a/LLama.SemanticKernel/LLamaSharp.SemanticKernel.csproj b/LLama.SemanticKernel/LLamaSharp.SemanticKernel.csproj
index 77596d57..c6ece4e7 100644
--- a/LLama.SemanticKernel/LLamaSharp.SemanticKernel.csproj
+++ b/LLama.SemanticKernel/LLamaSharp.SemanticKernel.csproj
@@ -10,8 +10,8 @@
     <ImplicitUsings>enable</ImplicitUsings>
     <Nullable>enable</Nullable>
-    <Version>0.6.2-beta1</Version>
-    <Authors>Tim Miller</Authors>
+    <Version>0.7.1</Version>
+    <Authors>Tim Miller, Xbotter</Authors>
     <Company>SciSharp STACK</Company>
     <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
    <Copyright>MIT, SciSharp STACK $([System.DateTime]::UtcNow.ToString(yyyy))</Copyright>
@@ -20,7 +20,7 @@
     <PackageIconUrl>https://avatars3.githubusercontent.com/u/44989469?s=200&amp;v=4</PackageIconUrl>
     <PackageTags>LLama, LLM, GPT, ChatGPT, semantic-kernel, SciSharp</PackageTags>
-      The integration of LLamaSharp ans semantic-kernel.
+      The integration of LLamaSharp and Microsoft semantic-kernel.
       Support integration with semantic-kernel
diff --git a/README.md b/README.md
index 397f641e..74d5aee6 100644
--- a/README.md
+++ b/README.md
@@ -54,6 +54,12 @@ For [microsoft semantic-kernel](https://github.com/microsoft/semantic-kernel) in
LLamaSharp.semantic-kernel
```
+For [microsoft kernel-memory](https://github.com/microsoft/kernel-memory) integration, please search and install the following package (currently kernel-memory only supports net6.0):
+
+```
+LLamaSharp.kernel-memory
+```
+
### Tips for choosing a version
 In general, there may be some breaking changes between two minor releases, for example 0.5.1 and 0.6.0. In contrast, we don't introduce API breaking changes in patch releases. Therefore it's recommended to keep the highest patch version of a minor release. For example, keep 0.5.6 instead of 0.5.3.
@@ -196,7 +202,7 @@ Another choice is generate gguf format file yourself with a pytorch weight (or a
🔳 Fine-tune
-⚠️ Local document search (enabled by kernel-memory now)
+✅ Local document search (enabled by kernel-memory now)
🔳 MAUI Integration
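
As a quick reference for the README change above, the newly published package can be installed with the standard `dotnet` CLI as an alternative to searching in the NuGet UI. The package id matches the `<PackageId>` introduced in the kernel-memory csproj, and the target project must be `net6.0`, since kernel-memory currently only supports net6.0:

```shell
# Add the new kernel-memory integration package to a project
dotnet add package LLamaSharp.kernel-memory

# The semantic-kernel integration is installed the same way
dotnet add package LLamaSharp.semantic-kernel
```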