UnderOcean 1.0.0
.NET CLI: dotnet add package UnderOcean --version 1.0.0
Package Manager: NuGet\Install-Package UnderOcean -Version 1.0.0
PackageReference: <PackageReference Include="UnderOcean" Version="1.0.0" />
Central Package Management: <PackageVersion Include="UnderOcean" Version="1.0.0" /> with <PackageReference Include="UnderOcean" /> in the project file
Paket CLI: paket add UnderOcean --version 1.0.0
Script & Interactive: #r "nuget: UnderOcean, 1.0.0"
File-based apps: #:package UnderOcean@1.0.0
Cake Addin: #addin nuget:?package=UnderOcean&version=1.0.0
Cake Tool: #tool nuget:?package=UnderOcean&version=1.0.0
UnderOcean
A powerful .NET library for AI conversation management with Ollama integration, featuring mixture of experts routing, embedding services, and intelligent conversation handling.
Features
- Ollama Integration: Seamless integration with Ollama for local AI model execution
- Mixture of Experts (MoE) Routing: Intelligent routing to select the best model for each query
- Embedding Services: Generate and work with text embeddings for semantic search
- Conversation Management: Maintain conversation context and history
- Tool Support: Extensible tool system for function calling
- Performance Monitoring: Built-in performance tracking and monitoring
Installation
dotnet add package UnderOcean
Quick Start
Setup Dependency Injection
using Microsoft.Extensions.DependencyInjection;
using UnderOcean.Abstractions;
using UnderOcean.Services;
var services = new ServiceCollection();
// AddHttpClient registers the HttpClient that the Ollama client uses
services.AddHttpClient();
// Register the UnderOcean services
services.AddSingleton<IOllamaClient>(sp => new OllamaClient(sp.GetRequiredService<HttpClient>()));
services.AddSingleton<IEmbeddingService, EmbeddingService>();
services.AddSingleton<IMoERouter>(sp => new MoERouter(sp.GetRequiredService<IOllamaClient>()));
services.AddSingleton<IChatService, ChatService>();
services.AddSingleton<IConversationManager, ConversationManager>();
var provider = services.BuildServiceProvider();
Basic Conversation
var conversationManager = provider.GetRequiredService<IConversationManager>();
var request = new ConversationRequest(
UserMessage: "Hello, how are you?",
SystemPrompt: "You are a helpful assistant.",
ConversationHistory: new List<ChatMessage>(),
KnowledgeBase: new List<EmbeddingContext>(),
AvailableTools: new List<ToolSchema>(),
Mode: ConversationMode.Auto
);
var response = await conversationManager.ProcessAsync(request, CancellationToken.None);
Console.WriteLine($"Response: {response.Response}");
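ConversationHistory is how earlier turns are carried into the next request. Below is a minimal follow-up sketch, assuming ChatMessage exposes settable Role and Content properties (those names are an assumption; check the actual type before using it):
// Build the history from the previous turn (Role/Content names are assumed, not confirmed).
var history = new List<ChatMessage>
{
    new() { Role = "user", Content = "Hello, how are you?" },
    new() { Role = "assistant", Content = response.Response }
};
var followUp = new ConversationRequest(
    UserMessage: "Can you rephrase that more formally?",
    SystemPrompt: "You are a helpful assistant.",
    ConversationHistory: history,
    KnowledgeBase: new List<EmbeddingContext>(),
    AvailableTools: new List<ToolSchema>(),
    Mode: ConversationMode.Auto
);
var followUpResponse = await conversationManager.ProcessAsync(followUp, CancellationToken.None);
Console.WriteLine($"Response: {followUpResponse.Response}");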
Working with Embeddings
var embeddingService = provider.GetRequiredService<IEmbeddingService>();
var texts = new[] { "Hello world", "How are you?" };
var embeddings = await embeddingService.EmbedAsync(texts, CancellationToken.None);
// Use embeddings for semantic search or similarity comparison
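A common way to compare embeddings is cosine similarity. The sketch below assumes EmbedAsync returns one float[] vector per input text; the exact return type is an assumption, so adapt the indexing to what the service actually returns:
// Cosine similarity between two embedding vectors (assumes equal-length float[] vectors).
static double CosineSimilarity(float[] a, float[] b)
{
    double dot = 0, normA = 0, normB = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
}
var similarity = CosineSimilarity(embeddings[0], embeddings[1]);
Console.WriteLine($"Similarity: {similarity:F3}");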
Tool Integration
var tools = new List<ToolSchema>
{
new()
{
Function = new ToolFunction
{
Name = "get_weather",
Description = "Get current weather information",
Parameters = new
{
type = "object",
properties = new
{
city = new { type = "string", description = "The city name" }
},
required = new[] { "city" }
}
}
}
};
var request = new ConversationRequest(
UserMessage: "What's the weather in New York?",
SystemPrompt: "You are a helpful assistant with access to weather data.",
ConversationHistory: new List<ChatMessage>(),
KnowledgeBase: new List<EmbeddingContext>(),
AvailableTools: tools,
Mode: ConversationMode.Auto
);
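Processing a tool-enabled request uses the same ProcessAsync call shown earlier. How the model's tool calls are surfaced depends on the response type, so inspect its members rather than assuming a shape:
var toolResponse = await conversationManager.ProcessAsync(request, CancellationToken.None);
// The plain-text answer is on Response; any tool-call details the model produced
// are carried on the response object (check the returned type's members for the exact shape).
Console.WriteLine($"Response: {toolResponse.Response}");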
Core Components
IOllamaClient
Interface for communicating with Ollama API endpoints.
IEmbeddingService
Service for generating text embeddings using Ollama models.
IMoERouter
Mixture of Experts router that selects the best model for each query based on content analysis.
IConversationManager
High-level service for managing conversations, including context, tools, and routing.
IChatService
Core chat functionality for sending messages and receiving responses.
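All five components are registered in the Quick Start container, so any of them can be resolved directly when you want to work below the IConversationManager level:
// Resolve lower-level components directly from the container.
var ollamaClient = provider.GetRequiredService<IOllamaClient>();
var embeddingSvc = provider.GetRequiredService<IEmbeddingService>();
var router = provider.GetRequiredService<IMoERouter>();
var chatService = provider.GetRequiredService<IChatService>();
// Their method signatures live in UnderOcean.Abstractions; consult the interfaces
// for the exact calls (e.g. which IMoERouter member selects a model for a query).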
Configuration
The library supports various configuration options through the service constructors and request parameters:
- Model Selection: Automatic or manual model selection
- Conversation Modes: Auto, Direct, or Custom routing (see the sketch after this list)
- Tool Integration: Extensible tool system for function calling
- Knowledge Base: Semantic search integration with embeddings
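As an example, routing can be pinned instead of left to the MoE router by changing the request's Mode. ConversationMode.Auto appears in the examples above; Direct and Custom are the other modes listed, so the exact enum member name below is an assumption to verify against the enum:
// Bypass automatic expert selection for this request.
// ConversationMode.Direct is inferred from the mode list above; confirm against the enum.
var directRequest = new ConversationRequest(
    UserMessage: "Explain dependency injection in one short paragraph.",
    SystemPrompt: "You are a helpful assistant.",
    ConversationHistory: new List<ChatMessage>(),
    KnowledgeBase: new List<EmbeddingContext>(),
    AvailableTools: new List<ToolSchema>(),
    Mode: ConversationMode.Direct
);
var directResponse = await conversationManager.ProcessAsync(directRequest, CancellationToken.None);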
Requirements
- .NET 8.0 or later
- Ollama running locally or accessible via network
- Compatible Ollama models installed
License
MIT License - see LICENSE file for details.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Product | Compatible and additional computed target framework versions
---|---
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed.
Dependencies
net8.0
- Microsoft.Extensions.DependencyInjection (>= 8.0.0)
- Microsoft.Extensions.Http (>= 8.0.0)
- System.Net.Http.Json (>= 8.0.0)
- System.Text.Json (>= 8.0.5)