# InControl.Core 1.0.0

Install with your preferred tool:

```shell
# .NET CLI
dotnet add package InControl.Core --version 1.0.0

# Package Manager Console
NuGet\Install-Package InControl.Core -Version 1.0.0

# Paket CLI
paket add InControl.Core --version 1.0.0
```

Or reference it in your project file:

```xml
<!-- Direct PackageReference -->
<PackageReference Include="InControl.Core" Version="1.0.0" />

<!-- Central package management: version in Directory.Packages.props -->
<PackageVersion Include="InControl.Core" Version="1.0.0" />
<!-- ...and the versionless reference in the project file -->
<PackageReference Include="InControl.Core" />
```

For scripts and build tooling:

```shell
# F# Interactive / C# scripting
#r "nuget: InControl.Core, 1.0.0"

# dotnet run file-based apps
#:package InControl.Core@1.0.0

# Cake addin / tool
#addin nuget:?package=InControl.Core&version=1.0.0
#tool nuget:?package=InControl.Core&version=1.0.0
```
# InControl-Desktop

**Local AI Chat Assistant for Windows**

A privacy-first, GPU-accelerated chat application that runs large language models entirely on your machine. No cloud required.
## Why InControl?

- **Private by default**: Your conversations never leave your computer
- **RTX-optimized**: Built for NVIDIA GPUs with CUDA acceleration
- **Native Windows experience**: WinUI 3 with Fluent Design
- **Multiple backends**: Ollama, llama.cpp, or bring your own
- **Markdown rendering**: Rich text, code blocks, and syntax highlighting
## NuGet Packages

The core libraries are available as standalone NuGet packages for building your own local AI integrations:

| Package | Description |
|---|---|
| `InControl.Core` | Domain models, conversation types, and shared abstractions for local AI chat applications. |
| `InControl.Inference` | LLM backend abstraction layer with streaming chat, model management, and health checks. Includes an Ollama implementation. |
```csharp
// Example: use InControl.Inference in your own app.
// Assumes a factory that resolves a backend by name.
var client = inferenceClientFactory.Create("ollama");

// Consume the response token by token as it streams in.
await foreach (var token in client.StreamChatAsync(messages))
{
    Console.Write(token);
}
```
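The snippet above assumes an `IInferenceClient` roughly shaped like the sketch below. Only the `IInferenceClient` and `IModelManager` names and their capabilities (streaming chat, model management, health checks) come from the package description; the member signatures, the `ChatMessage` record, and the `EchoClient` fake are illustrative assumptions, not the package's actual API:

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;

// Illustrative sketch only: signatures and ChatMessage are assumptions.
public record ChatMessage(string Role, string Content);

public interface IInferenceClient
{
    // Streams response tokens as they are generated by the backend.
    IAsyncEnumerable<string> StreamChatAsync(
        IReadOnlyList<ChatMessage> messages,
        CancellationToken cancellationToken = default);

    // Reports whether the backend (e.g. Ollama) is reachable.
    Task<bool> IsHealthyAsync(CancellationToken cancellationToken = default);
}

// A fake backend that echoes the last message word by word, useful for
// exercising the await-foreach consumption pattern without a real model.
public sealed class EchoClient : IInferenceClient
{
    public async IAsyncEnumerable<string> StreamChatAsync(
        IReadOnlyList<ChatMessage> messages,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        foreach (var word in messages[^1].Content.Split(' '))
        {
            await Task.Yield(); // simulate asynchronous token arrival
            yield return word + " ";
        }
    }

    public Task<bool> IsHealthyAsync(CancellationToken cancellationToken = default)
        => Task.FromResult(true);
}
```

A fake like `EchoClient` also keeps unit tests of the consuming code fast and model-free.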
## Target Hardware
| Component | Minimum | Recommended |
|---|---|---|
| GPU | RTX 3060 (8GB) | RTX 4080/5080 (16GB) |
| RAM | 16GB | 32GB |
| OS | Windows 10 1809+ | Windows 11 |
| .NET | 9.0 | 9.0 |
## Installation

### From Release (Recommended)

1. Download the latest MSIX package from Releases
2. Double-click to install
3. Launch from the Start Menu
### From Source

```shell
# Clone and build
git clone https://github.com/mcp-tool-shop-org/InControl-Desktop.git
cd InControl-Desktop
dotnet restore
dotnet build

# Run (requires Ollama running locally)
dotnet run --project src/InControl.App
```
## Prerequisites

InControl requires a local LLM backend. We recommend Ollama:

```shell
# Install Ollama from https://ollama.ai/download

# Pull a model
ollama pull llama3.2

# Start the server (listens on http://localhost:11434)
ollama serve
```
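If the app can't see the backend, a quick reachability probe helps. This sketch (not part of InControl; the helper name is made up) asks Ollama's `/api/tags` endpoint, which lists locally installed models:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical pre-flight helper: returns true if an Ollama server
// answers on the given base URL. /api/tags lists installed models.
async Task<bool> IsOllamaUpAsync(string baseUrl)
{
    using var http = new HttpClient { Timeout = TimeSpan.FromSeconds(2) };
    try
    {
        // Any 2xx response means the server is alive.
        var response = await http.GetAsync($"{baseUrl}/api/tags");
        return response.IsSuccessStatusCode;
    }
    catch (Exception)
    {
        return false; // connection refused or timed out: server not running
    }
}

Console.WriteLine(await IsOllamaUpAsync("http://localhost:11434")
    ? "Ollama is up."
    : "Ollama is not reachable - run `ollama serve`.");
```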
## Building

### Verify Build Environment

```shell
# Run the verification script
./scripts/verify.ps1
```

### Development Build

```shell
dotnet build
```

### Release Build

```shell
# Creates release artifacts in artifacts/
./scripts/release.ps1
```

### Run Tests

```shell
dotnet test
```
## Architecture

InControl follows a clean, layered architecture:

```
+-------------------------------------------+
| InControl.App (WinUI 3)                   |  UI Layer
+-------------------------------------------+
| InControl.ViewModels                      |  Presentation
+-------------------------------------------+
| InControl.Services                        |  Business Logic
+-------------------------------------------+
| InControl.Inference                       |  LLM Backends
+-------------------------------------------+
| InControl.Core                            |  Shared Types
+-------------------------------------------+
```

See ARCHITECTURE.md for detailed design documentation.
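Dependencies point strictly downward: each layer sees only the layer beneath it. Hand-wired, the flow looks roughly like this (all type names below are hypothetical stand-ins for illustration, not the app's real classes; in the real app the layers are composed via Microsoft.Extensions.DependencyInjection rather than constructed by hand):

```csharp
using System.Threading.Tasks;

// Hypothetical stand-ins illustrating the downward dependency flow:
// ViewModel -> Service -> Inference -> Core.
public record ChatMessage(string Role, string Content);              // Core

public interface IInferenceClient                                    // Inference
{
    Task<string> ChatAsync(ChatMessage message);
}

public sealed class ChatService                                      // Services
{
    private readonly IInferenceClient _client;
    public ChatService(IInferenceClient client) => _client = client;

    public Task<string> SendAsync(string text)
        => _client.ChatAsync(new ChatMessage("user", text));
}

public sealed class ChatViewModel                                    // ViewModels
{
    private readonly ChatService _service;
    public ChatViewModel(ChatService service) => _service = service;

    public Task<string> SubmitAsync(string text) => _service.SendAsync(text);
}

// A stub backend so the chain can be exercised without a model.
public sealed class StubClient : IInferenceClient
{
    public Task<string> ChatAsync(ChatMessage message)
        => Task.FromResult($"echo: {message.Content}");
}
```

Because the UI layer depends only on abstractions from lower layers, any `IInferenceClient` implementation can be swapped in without touching view or view-model code.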
## Data Storage

All data is stored locally:

| Data | Location |
|---|---|
| Sessions | `%LOCALAPPDATA%\InControl\sessions\` |
| Logs | `%LOCALAPPDATA%\InControl\logs\` |
| Cache | `%LOCALAPPDATA%\InControl\cache\` |
| Exports | `%USERPROFILE%\Documents\InControl\exports\` |
See PRIVACY.md for complete data handling documentation.
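In code, `%LOCALAPPDATA%` maps to `Environment.SpecialFolder.LocalApplicationData`, so the sessions directory from the table resolves like this (a sketch; the helper name is illustrative):

```csharp
using System;
using System.IO;

// Resolve the local sessions directory from the table above.
// %LOCALAPPDATA% corresponds to SpecialFolder.LocalApplicationData.
string GetSessionsDirectory()
{
    var localAppData = Environment.GetFolderPath(
        Environment.SpecialFolder.LocalApplicationData);
    return Path.Combine(localAppData, "InControl", "sessions");
}

Console.WriteLine(GetSessionsDirectory());
```

Using `Environment.GetFolderPath` rather than reading the environment variable directly keeps the lookup correct even when the variable is unset.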
## Troubleshooting

Common issues and solutions are documented in TROUBLESHOOTING.md.

### Quick Fixes

**App won't start:**
- Check that the .NET 9.0 Runtime is installed
- Run `dotnet --list-runtimes` to verify

**No models available:**
- Ensure Ollama is running: `ollama serve`
- Pull a model: `ollama pull llama3.2`

**GPU not detected:**
- Update NVIDIA drivers to the latest version
- Check the CUDA toolkit installation
## Contributing

Contributions welcome! Please:

1. Fork the repository
2. Create a feature branch
3. Write tests for new functionality
4. Submit a pull request

### Reporting Issues

1. Check TROUBLESHOOTING.md first
2. Use the "Copy Diagnostics" feature in the app
3. Open an issue with the diagnostics info attached
## Tech Stack
| Layer | Technology |
|---|---|
| UI Framework | WinUI 3 (Windows App SDK 1.6) |
| Architecture | MVVM with CommunityToolkit.Mvvm |
| LLM Integration | OllamaSharp, Microsoft.Extensions.AI |
| DI Container | Microsoft.Extensions.DependencyInjection |
| Configuration | Microsoft.Extensions.Configuration |
| Logging | Microsoft.Extensions.Logging + Serilog |
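CommunityToolkit.Mvvm's `[ObservableProperty]` source generator spares view models the property-change boilerplate. Written out by hand with only the BCL, the generated shape is roughly as follows (a sketch; the view-model name and property are made up for illustration):

```csharp
using System.ComponentModel;

// Roughly what CommunityToolkit.Mvvm generates for an [ObservableProperty],
// hand-written with the BCL's INotifyPropertyChanged.
public class ChatInputViewModel : INotifyPropertyChanged
{
    private string _draft = "";

    public string Draft
    {
        get => _draft;
        set
        {
            if (_draft == value) return; // skip no-op updates
            _draft = value;
            PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(Draft)));
        }
    }

    public event PropertyChangedEventHandler? PropertyChanged;
}
```

WinUI 3's `{x:Bind}`/`{Binding}` machinery subscribes to `PropertyChanged` to keep the UI in sync with the view model.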
## Version

Current version: 0.4.0-alpha

See CHANGELOG.md for release history.

## License

MIT

Built for Windows. Powered by local AI.
## Compatible Frameworks

| Product | Compatible and additional computed target frameworks |
|---|---|
| .NET | net9.0 is compatible. Computed: net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, net10.0-windows. |
## Dependencies

**net9.0**
- System.Text.Json (>= 9.0.1)
## NuGet packages (1)

Showing the top 1 NuGet packages that depend on InControl.Core:

| Package | Downloads |
|---|---|
| `InControl.Inference`: LLM backend abstraction layer for local inference. Provides IInferenceClient and IModelManager interfaces with streaming chat, model management, and health checks. Includes Ollama implementation out of the box. | |
## GitHub repositories

This package is not used by any popular GitHub repositories.