LLama.Web - Basic ASP.NET Core examples of LLamaSharp in action

LLama.Web has no heavy dependencies and no extra frameworks beyond Bootstrap and jQuery, keeping the examples clean and easy to copy over to your own project.

Websockets

Using SignalR WebSockets simplifies streaming responses and managing a model per connection.
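As a rough sketch of how this pattern works (the names `SessionConnectionHub`, `IModelSessionService` and its `InferAsync` method are illustrative assumptions here, not necessarily the actual types in LLama.Web/Hubs), a SignalR hub can push each generated token to the caller as it arrives instead of waiting for the full completion:

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

// Hypothetical hub sketch: one model session is kept per connection,
// keyed by the SignalR ConnectionId, and tokens are streamed back
// to the caller as they are generated.
public class SessionConnectionHub : Hub
{
    private readonly IModelSessionService _sessionService; // assumed session service

    public SessionConnectionHub(IModelSessionService sessionService)
    {
        _sessionService = sessionService;
    }

    public async Task SendPrompt(string prompt, CancellationToken cancellationToken)
    {
        // Stream each token to the client the moment it is produced.
        await foreach (var token in _sessionService.InferAsync(Context.ConnectionId, prompt, cancellationToken))
        {
            await Clients.Caller.SendAsync("ReceiveToken", token, cancellationToken);
        }

        // Signal the client that the response is finished.
        await Clients.Caller.SendAsync("ResponseComplete", cancellationToken);
    }
}
```

On the browser side the client simply registers a handler for `ReceiveToken` and appends each token to the page as it arrives.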

Setup

You can set up Models, Prompts and Inference parameters in appsettings.json.

Models: You can add multiple models to the options for quick selection in the UI. Options are based on ModelParams, so it's fully configurable.

Parameters: You can add multiple sets of inference parameters to the options for quick selection in the UI. Options are based on InferenceParams, so it's fully configurable.

Prompts: You can add multiple sets of prompts to the options for quick selection in the UI.

Example:

  {
    "Name": "Alpaca",
    "Path": "D:\\Repositories\\AI\\Prompts\\alpaca.txt",
    "Prompt": "Alternatively you can set the prompt text directly and omit the Path",
    "AntiPrompt": [
      "User:"
    ],
    "OutputFilter": [
      "Response:",
      "User:"
    ]
  }
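Putting the three option groups together, a full appsettings.json section might look roughly like the sketch below. This is illustrative only: the section name and most property names here are assumptions, and the real schema follows whatever ModelParams and InferenceParams expose.

```json
{
  "LLamaOptions": {
    "Models": [
      {
        "Name": "ExampleModel",
        "ModelPath": "D:\\Repositories\\AI\\Models\\example-model.bin",
        "ContextSize": 2048
      }
    ],
    "Parameters": [
      {
        "Name": "Default",
        "Temperature": 0.6
      }
    ],
    "Prompts": [
      {
        "Name": "Alpaca",
        "Path": "D:\\Repositories\\AI\\Prompts\\alpaca.txt",
        "AntiPrompt": [ "User:" ]
      }
    ]
  }
}
```

Each named entry then appears as a selectable option in the UI, so you can switch between models, parameter sets and prompts without restarting the app.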

Interactive UI

The interactive UI is a simple example of using LLamaSharp.

[demo-ui screenshot]