| File | Last commit | Date |
|------|-------------|------|
| DecodeResult.cs | Swapped `StatelessExecutor` to use `llama_decode`! | 2024-01-20 21:18:35 +00:00 |
| GGMLType.cs | Added a comment on the type itself | 2023-12-14 01:23:44 +00:00 |
| GPUSplitMode.cs | Updated everything to work with llama.cpp ce32060198b7e2d6a13a9b8e1e1369e3c295ae2a | 2024-02-01 16:35:05 +00:00 |
| GroupDisposable.cs | Added `GroupDisposable` to dispose a collection of items all together | 2023-12-14 01:23:45 +00:00 |
| LLamaBatch.cs | Fixed off by one error in LLamaBatch sampling position (#626) | 2024-03-25 22:56:26 +00:00 |
| LLamaBeamView.cs | April 2024 Binary Update (#662) | 2024-04-16 23:19:47 +01:00 |
| LLamaBeamsState.cs | April 2024 Binary Update (#662) | 2024-04-16 23:19:47 +01:00 |
| LLamaChatMessage.cs | March Binary Update (#565) | 2024-03-06 15:19:42 +00:00 |
| LLamaContextParams.cs | Added `LLamaWeights.LoadFromFileAsync` | 2024-04-27 02:52:41 +01:00 |
| LLamaFtype.cs | April 2024 Binary Update (#662) | 2024-04-16 23:19:47 +01:00 |
| LLamaGrammarElement.cs | Improved test coverage. Discovered some issues: | 2023-11-18 02:40:36 +00:00 |
| LLamaKvCacheView.cs | April 2024 Binary Update (#662) | 2024-04-16 23:19:47 +01:00 |
| LLamaLogLevel.cs | Logging interceptor (#649) | 2024-04-05 16:42:27 +01:00 |
| LLamaModelMetadataOverride.cs | March Binary Update (#565) | 2024-03-06 15:19:42 +00:00 |
| LLamaModelParams.cs | Added `LLamaWeights.LoadFromFileAsync` | 2024-04-27 02:52:41 +01:00 |
| LLamaModelQuantizeParams.cs | April 2024 Binary Update (#662) | 2024-04-16 23:19:47 +01:00 |
| LLamaNativeBatch.cs | Removed `LLamaBatchSafeHandle` (using unmanaged memory, created by llama.cpp) and replaced it with a fully managed `LLamaBatch`. Modified the `BatchedDecoding` example to use the new managed batch. | 2024-01-19 23:26:36 +00:00 |
| LLamaPoolingType.cs | April 2024 Binary Update (#662) | 2024-04-16 23:19:47 +01:00 |
| LLamaPos.cs | Added increment and decrement operators to `LLamaPos` | 2024-02-07 17:04:57 +00:00 |
| LLamaRopeType.cs | March Binary Update (#565) | 2024-03-06 15:19:42 +00:00 |
| LLamaSeqId.cs | Removed `LLamaBatchSafeHandle` (using unmanaged memory, created by llama.cpp) and replaced it with a fully managed `LLamaBatch`. Modified the `BatchedDecoding` example to use the new managed batch. | 2024-01-19 23:26:36 +00:00 |
| LLamaToken.cs | Added a test with examples of troublesome strings from 0.9.1 | 2024-01-16 15:02:23 +00:00 |
| LLamaTokenData.cs | Made casts to/from int explicit, fixed affected places | 2024-01-02 20:57:37 +00:00 |
| LLamaTokenDataArray.cs | Classifier Free Guidance (#536) | 2024-02-26 15:41:57 +00:00 |
| LLamaTokenType.cs | Assorted small changes to clean up some code warnings | 2024-02-17 23:07:10 +00:00 |
| LLamaVocabType.cs | April 2024 Binary Update (#662) | 2024-04-16 23:19:47 +01:00 |
| LLavaImageEmbed.cs | Llava api (#563) | 2024-03-13 22:10:44 +00:00 |
| NativeApi.BeamSearch.cs | Code cleanup driven by R# suggestions: | 2024-01-02 03:20:21 +00:00 |
| NativeApi.Grammar.cs | Removed unnecessary constructors from safe handles | 2024-04-26 01:03:26 +01:00 |
| NativeApi.LLava.cs | fix: typos. | 2024-04-29 18:19:20 +08:00 |
| NativeApi.Load.cs | fix: typos. | 2024-04-29 18:19:20 +08:00 |
| NativeApi.Quantize.cs | Modified `llama_model_quantize` to accept argument by `ref` instead of pointer. | 2024-04-26 01:35:13 +01:00 |
| NativeApi.Sampling.cs | Classifier Free Guidance (#536) | 2024-02-26 15:41:57 +00:00 |
| NativeApi.cs | Added tests for generating embeddings with generative model and embedding model | 2024-04-19 16:30:32 +01:00 |
| NativeLibraryConfig.cs | fix: typos. | 2024-04-29 18:19:20 +08:00 |
| NativeLogConfig.cs | Logging interceptor (#649) | 2024-04-05 16:42:27 +01:00 |
| RopeScalingType.cs | March Binary Update (#565) | 2024-03-06 15:19:42 +00:00 |
| SafeLLamaContextHandle.cs | Add method to get BOS token. | 2024-05-02 23:29:33 -06:00 |
| SafeLLamaGrammarHandle.cs | Using `is` check instead of `== null` | 2024-04-26 13:53:04 +01:00 |
| SafeLLamaHandleBase.cs | Removed one of the constructors of `SafeLLamaHandleBase`, which implicitly stated that memory is owned. Better to be explicit about this kind of thing! | 2024-01-31 18:01:03 +00:00 |
| SafeLlamaModelHandle.cs | Added `LLamaWeights.LoadFromFileAsync` | 2024-04-27 02:52:41 +01:00 |
| SafeLlavaImageEmbedHandle.cs | Removed unnecessary constructors from safe handles | 2024-04-26 01:03:26 +01:00 |
| SafeLlavaModelHandle.cs | Added `LoadFromFileAsync` method for `LLavaWeights` | 2024-04-27 23:31:07 +01:00 |