weekly

GitHub C# Trending

The latest build: 2024-07-19. Source of data: GitHub Trending RSS.

RAG architecture: index and query any data using LLM and natural language, track sources, show citations, asynchronous memory patterns.


Kernel Memory

License: MIT · Discord

This repository presents best practices and a reference architecture for memory in specific AI and LLM application scenarios. Please note that the provided code serves as a demonstration and is not an officially supported Microsoft offering.

Kernel Memory (KM) is a multi-modal AI Service specialized in the efficient indexing of datasets through custom continuous data hybrid pipelines, with support for Retrieval Augmented Generation (RAG), synthetic memory, prompt engineering, and custom semantic memory processing.

KM is available as a Web Service, as a Docker container, a Plugin for ChatGPT/Copilot/Semantic Kernel, and as a .NET library for embedded applications.


Utilizing advanced embeddings and LLMs, the system enables Natural Language querying for obtaining answers from the indexed data, complete with citations and links to the original sources.


Designed for seamless integration as a Plugin with Semantic Kernel, Microsoft Copilot and ChatGPT, Kernel Memory enhances data-driven features in applications built for the most popular AI platforms.

Synchronous Memory API (aka "serverless")

Kernel Memory works and scales best when running as an asynchronous Web Service, allowing you to ingest thousands of documents and large amounts of information without blocking your app.

However, Kernel Memory can also run in serverless mode, embedding a MemoryServerless class instance in .NET backend/console/desktop apps in synchronous mode. This approach also works in ASP.NET Web APIs and Azure Functions. Each request is processed immediately, although calling clients are responsible for handling transient errors.
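For instance, one way to host the serverless memory inside an ASP.NET app is to register it with the dependency injection container. The sketch below is an assumption about typical wiring rather than an official sample; the /import endpoint and its filePath parameter are illustrative only:

// Program.cs of an ASP.NET app (illustrative sketch; adjust to your own configuration)
using Microsoft.KernelMemory;

var builder = WebApplication.CreateBuilder(args);

// Register a synchronous, in-process memory instance for injection into endpoints/controllers
builder.Services.AddSingleton<IKernelMemory>(_ =>
    new KernelMemoryBuilder()
        .WithOpenAIDefaults(Environment.GetEnvironmentVariable("OPENAI_API_KEY"))
        .Build<MemoryServerless>());

var app = builder.Build();

// Hypothetical endpoint: import a document synchronously and return its document ID
app.MapPost("/import", async (IKernelMemory memory, string filePath) =>
    await memory.ImportDocumentAsync(filePath));

app.Run();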


Importing documents into your Kernel Memory can be as simple as this:

var memory = new KernelMemoryBuilder()
    .WithOpenAIDefaults(Environment.GetEnvironmentVariable("OPENAI_API_KEY"))
    .Build<MemoryServerless>();

// Import a file
await memory.ImportDocumentAsync("meeting-transcript.docx", tags: new() { { "user", "Blake" } });

// Import multiple files and apply multiple tags
await memory.ImportDocumentAsync(new Document("file001")
    .AddFile("business-plan.docx")
    .AddFile("project-timeline.pdf")
    .AddTag("user", "Blake")
    .AddTag("collection", "business")
    .AddTag("collection", "plans")
    .AddTag("fiscalYear", "2023"));

Asking questions:

var answer1 = await memory.AskAsync("How many people attended the meeting?");

var answer2 = await memory.AskAsync("what's the project timeline?",
    filter: new MemoryFilter().ByTag("user", "Blake"));

The example leverages the default document ingestion pipeline:

  1. Extract text: recognize the file format and extract the information
  2. Partition the text into small chunks to optimize search
  3. Extract embeddings using an LLM embedding generator
  4. Save embeddings into a vector index such as Azure AI Search, Qdrant, or other DBs.

In the example, memories are organized by users using tags, safeguarding private information. Furthermore, memories can be categorized and structured using tags, enabling efficient search and retrieval through faceted navigation.
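Besides asking for answers, the same tags can scope a plain search over the indexed chunks. The snippet below is a sketch based on the IKernelMemory interface; the exact shape of the returned citations may vary across Kernel Memory versions:

// Search raw memory chunks belonging to user "Blake" (sketch; result shape may differ per version)
var results = await memory.SearchAsync("project timeline",
    filter: new MemoryFilter().ByTag("user", "Blake"));

foreach (var citation in results.Results)
{
    Console.WriteLine($"{citation.SourceName}: {citation.Partitions.Count} matching partitions");
}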

Data lineage, citations, referencing sources:

All memories and answers are fully correlated to the data provided. When producing an answer, Kernel Memory includes all the information needed to verify its accuracy:

await memory.ImportFileAsync("NASA-news.pdf");

var answer = await memory.AskAsync("Any news from NASA about Orion?");

Console.WriteLine(answer.Result + "\n");

foreach (var x in answer.RelevantSources)
{
    Console.WriteLine($" * {x.SourceName} -- {x.Partitions.First().LastUpdate:D}");
}

Yes, there is news from NASA about the Orion spacecraft. NASA has invited the media to see a new test version [......] For more information about the Artemis program, you can visit the NASA website.

  • NASA-news.pdf -- Tuesday, August 1, 2023

Memory as a Service - Asynchronous API

Depending on your scenarios, you might want to run all the code locally inside your process, or remotely through an asynchronous and scalable service.


If you're importing small files, only need C#, and can block the process during the import, local in-process execution can work well, using the MemoryServerless class seen above.

However, if you are in one of these scenarios:

  • I'd just like a web service to import data and send queries to get answers
  • My app is written in TypeScript, Java, Rust, or some other language
  • I'm importing big documents that can require minutes to process, and I don't want to block the user interface
  • I need memory import to run independently, supporting failures and retry logic
  • I want to define custom pipelines mixing multiple languages like Python, TypeScript, etc

then you can deploy Kernel Memory as a backend service, plugging in the default handlers, or your custom Python/TypeScript/Java/etc. handlers, and leveraging the asynchronous non-blocking memory encoding process, sending documents and asking questions using the MemoryWebClient.

Here you can find a complete set of instructions on how to run the Kernel Memory service.

Kernel Memory (KM) and SK Semantic Memory (SM)

Kernel Memory (KM) is a service built on the feedback received and lessons learned from developing Semantic Kernel (SK) and Semantic Memory (SM). It provides several features that would otherwise have to be developed manually, such as storing files, extracting text from files, providing a framework to secure users' data, etc. The KM codebase is entirely in .NET, which eliminates the need to write and maintain features in multiple languages. As a service, KM can be used from any language, tool, or platform, e.g. browser extensions and ChatGPT assistants.

Semantic Memory (SM) is a library for C#, Python, and Java that wraps direct calls to databases and supports vector search. It was developed as part of the Semantic Kernel (SK) project and serves as the first public iteration of long-term memory. The core library is maintained in three languages, while the list of supported storage engines (known as "connectors") varies across languages.

Here's a comparison table:

| Feature | Kernel Memory | Semantic Memory |
| --- | --- | --- |
| Data formats | Web pages, PDF, Images, Word, PowerPoint, Excel, Markdown, Text, JSON, HTML | Text only |
| Search | Cosine similarity, Hybrid search with filters (AND/OR conditions) | Cosine similarity |
| Language support | Any language, command line tools, browser extensions, low-code/no-code apps, chatbots, assistants, etc. | C#, Python, Java |
| Storage engines | Azure AI Search, Elasticsearch, MongoDB Atlas, Postgres+pgvector, Qdrant, Redis, SQL Server, In memory KNN, On disk KNN | Azure AI Search, Chroma, DuckDB, Kusto, Milvus, MongoDB, Pinecone, Postgres, Qdrant, Redis, SQLite, Weaviate |
| File storage | Disk, Azure Blobs, AWS S3, MongoDB Atlas, In memory (volatile) | - |
| RAG | Yes, with sources lookup | - |
| Summarization | Yes | - |
| OCR | Yes, via Azure Document Intelligence | - |
| Security Filters | Yes | - |
| Large document ingestion | Yes, including async processing using queues (Azure Queues, RabbitMQ, File based or In memory queues) | - |
| Document storage | Yes | - |
| Custom storage schema | Some DBs | - |
| Vector DBs with internal embedding | Yes | - |
| Concurrent write to multiple vector DBs | Yes | - |
| LLMs | Azure OpenAI, OpenAI, Anthropic, LLamaSharp via llama.cpp, LM Studio, Semantic Kernel connectors | Azure OpenAI, OpenAI, Gemini, Hugging Face, ONNX, custom ones, etc. |
| LLMs with dedicated tokenization | Yes | No |
| Cloud deployment | Yes | - |
| Web service with OpenAPI | Yes | - |

Quick test using the Docker image

If you want to give the service a quick test, use the following command to start the Kernel Memory Service using OpenAI:

docker run -e OPENAI_API_KEY="..." -it --rm -p 9001:9001 kernelmemory/service

If you prefer using custom settings and services such as Azure OpenAI, Azure Document Intelligence, etc., you should create an appsettings.Development.json file overriding the default values set in appsettings.json, or use the included configuration wizard:

cd service/Service
dotnet run setup

Then run this command to start the Docker image with the configuration just created:

on Windows:

docker run --volume .\appsettings.Development.json:/app/appsettings.Production.json -it --rm -p 9001:9001 kernelmemory/service

on macOS/Linux:

docker run --volume ./appsettings.Development.json:/app/appsettings.Production.json -it --rm -p 9001:9001 kernelmemory/service

Import files using KM web service and MemoryWebClient

#reference clients/WebClient/WebClient.csproj

var memory = new MemoryWebClient("http://127.0.0.1:9001"); // <== URL where the web service is running

// Import a file (default user)
await memory.ImportDocumentAsync("meeting-transcript.docx");

// Import a file specifying a Document ID, User and Tags
await memory.ImportDocumentAsync("business-plan.docx",
    new DocumentDetails("[email protected]", "file001")
        .AddTag("collection", "business")
        .AddTag("collection", "plans")
        .AddTag("fiscalYear", "2023"));
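Asking a question through the same web client (assuming the service at the URL above is running) might look like the following sketch; MemoryWebClient exposes the same AskAsync-style API shown earlier for the serverless mode:

// Query the remote service through the web client
var answer = await memory.AskAsync("What's the project timeline?");
Console.WriteLine(answer.Result);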

Get answers via the web service

curl http://127.0.0.1:9001/ask -d'{"query":"Any news from NASA about Orion?"}' -H 'Content-Type: application/json'
{ "Query": "Any news from NASA about Orion?", "Text": "Yes, there is news from NASA about the Orion spacecraft. NASA has invited the media to see a new test version [......] For more information about the Artemis program, you can visit the NASA website.", "RelevantSources": [ { "Link": "...", "SourceContentType": "application/pdf", "SourceName": "file5-NASA-news.pdf", "Partitions": [ { "Text": "Skip to main content\nJul 28, 2023\nMEDIA ADVISORY M23-095\nNASA Invites Media to See Recovery Craft for\nArtemis Moon Mission\n(/sites/default/les/thumbnails/image/ksc-20230725-ph-fmx01_0003orig.jpg)\nAboard the [......] to Mars (/topics/moon-to-\nmars/),Orion Spacecraft (/exploration/systems/orion/index.html)\nNASA Invites Media to See Recovery Craft for Artemis Moon Miss... https://www.nasa.gov/press-release/nasa-invites-media-to-see-recov...\n2 of 3 7/28/23, 4:51 PM", "Relevance": 0.8430657, "SizeInTokens": 863, "LastUpdate": "2023-08-01T08:15:02-07:00" } ] } ]}

You can find a full example here.

Custom memory ingestion pipelines

On the other hand, if you need a custom data pipeline, you can also customize the steps, which will be handled by your custom business logic:

// Memory setup, e.g. how to calculate and where to store embeddings
var memoryBuilder = new KernelMemoryBuilder()
    .WithoutDefaultHandlers()
    .WithOpenAIDefaults(Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

var memory = memoryBuilder.Build();

// Plug in custom .NET handlers
memory.Orchestrator.AddHandler<MyHandler1>("step1");
memory.Orchestrator.AddHandler<MyHandler2>("step2");
memory.Orchestrator.AddHandler<MyHandler3>("step3");

// Use the custom handlers with the memory object
await memory.ImportDocumentAsync(
    new Document("mytest001")
        .AddFile("file1.docx")
        .AddFile("file2.pdf"),
    steps: new[] { "step1", "step2", "step3" });

Web API specs with OpenAPI swagger

The API schema is available at http://127.0.0.1:9001/swagger/index.html when running the service locally with OpenAPI enabled.

Examples and Tools

Examples

  1. Collection of Jupyter notebooks with various scenarios
  2. Using Kernel Memory web service to upload documents and answer questions
  3. Importing files and asking questions without running the service (serverless mode)
  4. Using KM Plugin for Semantic Kernel
  5. Processing files with custom logic (custom handlers) in serverless mode
  6. Processing files with custom logic (custom handlers) in asynchronous mode
  7. Upload files and ask questions from command line using curl
  8. Customizing RAG and summarization prompts
  9. Custom partitioning/text chunking options
  10. Using a custom embedding/vector generator
  11. Using custom LLMs
  12. Using LLama
  13. Summarizing documents, using synthetic memories
  14. Using Semantic Kernel LLM connectors
  15. Using custom content decoders
  16. Using a custom web scraper to fetch web pages
  17. Generating answers with Anthropic LLMs
  18. Hybrid Search with Azure AI Search
  19. Writing and using a custom ingestion handler
  20. Running a single asynchronous pipeline handler as a standalone service
  21. Test project using KM package from nuget.org
  22. Integrating Memory with ASP.NET applications and controllers
  23. Sample code showing how to extract text from files
  24. .NET configuration and logging
  25. Expanding chunks retrieving adjacent partitions
  26. Using local models via LM Studio
  27. Using Context Parameters to customize RAG prompt during a request
  28. Creating a Memory instance without KernelMemoryBuilder

Tools

  1. .NET appsettings.json generator
  2. Curl script to upload files
  3. Curl script to ask questions
  4. Curl script to search documents
  5. Script to start Qdrant for development tasks
  6. Script to start Elasticsearch for development tasks
  7. Script to start MS SQL Server for development tasks
  8. Script to start Redis for development tasks
  9. Script to start RabbitMQ for development tasks
  10. Script to start MongoDB Atlas for development tasks

.NET packages

  • Microsoft.KernelMemory.WebClient: .NET web client to call a running instance of Kernel Memory web service.

    Nuget package · Example code

  • Microsoft.KernelMemory.Core: Kernel Memory core library including all extensions. It can be used to build custom pipelines and handlers, and it also contains the serverless client to use memory synchronously without the web service.

    Nuget package · Example code

  • Microsoft.KernelMemory.Service.AspNetCore: an extension to load Kernel Memory into your ASP.NET apps.

    Nuget package · Example code

  • Microsoft.KernelMemory.SemanticKernelPlugin: a Memory plugin for Semantic Kernel, replacing the original Semantic Memory available in SK.

    Nuget package · Example code

Packages for Python, Java and other languages

The Kernel Memory service offers a Web API out of the box, including the OpenAPI swagger documentation that you can leverage to test the API and create custom web clients. For instance, after starting the service locally, see http://127.0.0.1:9001/swagger/index.html.
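As an illustration of what a custom client can look like in any language, the sketch below calls the /ask endpoint shown earlier directly over HTTP; it assumes the service is running locally on port 9001 and is not an official client:

using System.Net.Http.Json;

// Minimal hand-rolled client for the /ask endpoint (illustrative sketch, not an official client)
using var http = new HttpClient { BaseAddress = new Uri("http://127.0.0.1:9001") };

var response = await http.PostAsJsonAsync("/ask", new { query = "Any news from NASA about Orion?" });
response.EnsureSuccessStatusCode();

// Print the raw JSON answer, including the relevant sources
var json = await response.Content.ReadAsStringAsync();
Console.WriteLine(json);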

A .NET Web Client and a Semantic Kernel plugin are available; see the NuGet packages above.

A Python package with a Web Client and Semantic Kernel plugin will soon be available. We also welcome PR contributions to support more languages.

Contributors

aaronpowellafederici75akordowskialexibraimovalkampfergitamomra
anthonypuppochaellicherchykcoryisaksoncrickmandependabot[bot]
dlucDM-98EelcoKosterFoorceeGraemeJones104jurepurgar
kbeaugrandkoteusKSemenenkolecramrluismanezmarcominerva
neel015pascalbergerpawarsum12pradeepr-roboticistqihangnetroldengarm
slapointeslorello89spenavajrTaoChenOSUteresaqhoangv-msamovendyuk
Valkozaurvicperdanawestdavidrxbotter

This repository is for active development of the Azure SDK for .NET. For consumers of the SDK we recommend visiting our public developer docs at https://learn.microsoft.com/dotnet/azure/ or our versioned developer docs at https://azure.github.io/azure-sdk-for-net.


Azure SDK for .NET

Packages · Dependencies · Dependencies Graph

This repository is for active development of the Azure SDK for .NET. For consumers of the SDK we recommend visiting our public developer docs or our versioned developer docs.

Getting started

To get started with a library, see the README.md file located in the library's project folder. You can find these library folders grouped by service in the /sdk directory.

For tutorials, samples, quick starts, and other documentation, go to Azure for .NET Developers.

Packages available

Each service might have a number of libraries available from each of the following categories:

Client: New Releases

A new wave of packages that we are announcing as GA, plus several that are currently releasing in preview. These libraries follow the Azure SDK Design Guidelines for .NET and share a number of core features such as HTTP retries, logging, transport protocols, authentication protocols, etc., so that once you learn how to use these features in one client library, you will know how to use them in the others. You can learn about these shared features at Azure.Core.

These new client libraries can be identified by the naming used for their folder, package, and namespace. Each will start with Azure, followed by the service category, and then the name of the service. For example Azure.Storage.Blobs.
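For instance, a minimal sketch using one of these new clients (assuming the Azure.Storage.Blobs and Azure.Identity packages are installed; the account URL and container name below are placeholders) illustrates the shared Azure.Core patterns such as credential-based authentication:

using Azure.Identity;
using Azure.Storage.Blobs;

// Authenticate with Azure.Identity and upload a small text blob
// (the storage account URL and container name are placeholders)
var serviceClient = new BlobServiceClient(
    new Uri("https://<storage-account>.blob.core.windows.net"),
    new DefaultAzureCredential());

var containerClient = serviceClient.GetBlobContainerClient("sample-container");
await containerClient.CreateIfNotExistsAsync();

using var content = new MemoryStream(System.Text.Encoding.UTF8.GetBytes("Hello, Azure SDK!"));
await containerClient.UploadBlobAsync("hello.txt", content);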

For a complete list of available packages, please see the latest available packages page.

NOTE: If you need to ensure your code is ready for production we strongly recommend using one of the stable, non-preview libraries.

Client: Previous Versions

Last stable versions of packages that are production-ready. These libraries provide similar functionalities to the preview packages, as they allow you to use and consume existing resources and interact with them, for example: upload a storage blob. Stable library directories typically contain 'Microsoft.Azure' in their names, e.g. 'Microsoft.Azure.KeyVault'. They might not implement the guidelines or have the same feature set as the November releases. They do however offer wider coverage of services.

Management: New Releases

A new set of management libraries that follow the Azure SDK Design Guidelines for .NET and are based on Azure.Core libraries is now in Public Preview. These new libraries provide a number of core capabilities that are shared amongst all Azure SDKs, including the intuitive Azure Identity library, an HTTP pipeline with custom policies, error handling, distributed tracing, and much more. You can find the list of new packages on this page.

To get started with these new libraries, please see the quickstart guide here. These new libraries can be identified by namespaces that start with Azure.ResourceManager, e.g. Azure.ResourceManager.Network.
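A minimal sketch of this new management-plane pattern (assuming the Azure.ResourceManager and Azure.Identity packages are installed, and that the subscription is resolved from the signed-in credential) might look like this:

using Azure.Identity;
using Azure.ResourceManager;

// Create an ARM client with the default credential and list resource groups in the default subscription
var armClient = new ArmClient(new DefaultAzureCredential());
var subscription = await armClient.GetDefaultSubscriptionAsync();

await foreach (var resourceGroup in subscription.GetResourceGroups().GetAllAsync())
{
    Console.WriteLine(resourceGroup.Data.Name);
}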

NOTE: If you need to ensure your code is ready for production, use one of the stable, non-preview libraries.

Management: Previous Versions

For a complete list of management libraries which enable you to provision and manage Azure resources, please check here. They might not have the same feature set as the new releases but they do offer wider coverage of services. Previous versions of management libraries can be identified by namespaces that start with Microsoft.Azure.Management, e.g. Microsoft.Azure.Management.Network

Documentation and code samples for these libraries can be found here.

Need help?

Community

  • Chat with other community members: join the chat at https://gitter.im/azure/azure-sdk-for-net

Reporting security issues and security bugs

Security issues and bugs should be reported privately, via email, to the Microsoft Security Response Center (MSRC) [email protected]. You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Further information, including the MSRC PGP key, can be found in the Security TechCenter.

We want your thoughts!

Feature Requests

What features are important to you? You can let us know by looking at our open feature requests and sharing your thoughts by giving the issue a thumbs up or thumbs down. (Note the list is sorted by number of thumbs up in descending order.)

Design Discussions

We would love to incorporate the community's input into our library design process. Here's a list of design discussions that we're currently having. Participate in the discussions by leaving your comments in the issue!

Contributing

For details on contributing to this repository, see the contributing guide.

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repositories using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.


Blazor Component Library based on Material Design with an emphasis on ease of use. Mainly written in C#, with JavaScript kept to a bare minimum, it empowers .NET developers to easily debug it if needed.


MudBlazor

Material Design components for Blazor


MudBlazor is an ambitious Material Design component framework for Blazor with an emphasis on ease of use and clear structure. It is perfect for .NET developers who want to rapidly build web applications without having to struggle with CSS and Javascript. MudBlazor, being written entirely in C#, empowers you to adapt, fix or extend the framework. There are plenty of examples in the documentation, which makes understanding and learning MudBlazor very easy.

Documentation & Demo

Why is MudBlazor so successful?

  • Clean and aesthetic graphic design based on Material Design.
  • Clear and easy to understand structure.
  • Good documentation with many examples and source snippets.
  • All components are written entirely in C#, no JavaScript allowed (except where absolutely necessary).
  • Users can make beautiful apps without needing CSS (but they can of course use CSS too).
  • No dependencies on other component libraries, 100% control over components and features.
  • Stability! We strive for complete test coverage.
  • Releases often so developers can get their PRs and fixes in a timely fashion.

Prerequisites

| MudBlazor | .NET | Support |
| --- | --- | --- |
| 1.x.x - 2.0.x | .NET 3.1 | Ended 03/2021 |
| 5.x.x | .NET 5 | Ended 01/2022 |
| 6.x.x | .NET 6, .NET 7, .NET 8 | |
| 7.x.x | .NET 7, .NET 8 | |

Currently only interactive rendering modes are supported - Learn more.

Blazor only supports current browser versions. To ensure a seamless experience with MudBlazor, please use an up-to-date web browser. If a browser version is no longer maintained by its publisher, we cannot guarantee compatibility with MudBlazor.

Stats


Contributing

Thanks for wanting to contribute!
Contributions from the community are what makes MudBlazor successful.

If you are familiar with technologies like C#, Blazor, JavaScript, or CSS, and wish to give something back, please consider submitting a pull request! We try to merge all non-breaking bugfixes and will deliberate the value of new features for the community. Please note there is no guarantee your PR will be merged, so if you want to be sure before investing the work, feel free to contact the team first.

Check out the contribution guidelines to understand our goals and learn more about the internals of the project.

Getting Started

Full installation instructions can be found on our website.
Alternatively use one of our templates from the MudBlazor.Templates repo.

Quick Installation Guide

Install Package

dotnet add package MudBlazor

Add the following to _Imports.razor

@using MudBlazor

Add the following to the MainLayout.razor or App.razor

<MudThemeProvider/>
<MudPopoverProvider/>
<MudDialogProvider/>
<MudSnackbarProvider/>

Add the following to index.html (client-side) or _Host.cshtml (server-side) in the head

<link href="https://fonts.googleapis.com/css?family=Roboto:300,400,500,700&display=swap" rel="stylesheet" />
<link href="_content/MudBlazor/MudBlazor.min.css" rel="stylesheet" />

Add the following to index.html or _Host.cshtml in the body

<script src="_content/MudBlazor/MudBlazor.min.js"></script>

Add the following to the relevant sections of Program.cs

using MudBlazor.Services;
builder.Services.AddMudServices();

Usage

<MudText Typo="Typo.h6">MudBlazor is @Text</MudText>
<MudButton Variant="Variant.Filled" Color="Color.Primary" OnClick="ButtonOnClick">@ButtonText</MudButton>

@code {
    public string Text { get; set; } = "????";
    public string ButtonText { get; set; } = "Click Me";
    public int ButtonClicked { get; set; }

    void ButtonOnClick()
    {
        ButtonClicked += 1;
        Text = $"Awesome x {ButtonClicked}";
        ButtonText = "Click Me Again";
    }
}