InControl.Core 1.1.0

There is a newer version of this package available.
See the version list below for details.
dotnet add package InControl.Core --version 1.1.0

NuGet\Install-Package InControl.Core -Version 1.1.0

This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

<PackageReference Include="InControl.Core" Version="1.1.0" />

For projects that support PackageReference, copy this XML node into the project file to reference the package.

For projects that support Central Package Management (CPM), copy these XML nodes into the solution Directory.Packages.props file and the project file to version the package.

Directory.Packages.props:
<PackageVersion Include="InControl.Core" Version="1.1.0" />

Project file:
<PackageReference Include="InControl.Core" />

paket add InControl.Core --version 1.1.0

#r "nuget: InControl.Core, 1.1.0"

The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.

#:package InControl.Core@1.1.0

The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

Install as a Cake Addin:
#addin nuget:?package=InControl.Core&version=1.1.0

Install as a Cake Tool:
#tool nuget:?package=InControl.Core&version=1.1.0

InControl-Desktop

Local AI Chat Assistant for Windows

A privacy-first, GPU-accelerated chat application that runs large language models entirely on your machine. No cloud required.

Why InControl?

  • Private by default - Your conversations never leave your computer
  • RTX-optimized - Built for NVIDIA GPUs with CUDA acceleration
  • Native Windows experience - WinUI 3 with Fluent Design
  • Multiple backends - Ollama, llama.cpp, or bring your own
  • Markdown rendering - Rich text, code blocks, and syntax highlighting

NuGet Packages

The core libraries are available as standalone NuGet packages for building your own local AI integrations:

Package              Description
InControl.Core       Domain models, conversation types, and shared abstractions for local AI chat applications.
InControl.Inference  LLM backend abstraction layer with streaming chat, model management, and health checks. Includes Ollama implementation.
// Example: stream a chat completion with InControl.Inference.
// `inferenceClientFactory` resolves an IInferenceClient for the named backend,
// and `messages` holds the conversation history to send.
var client = inferenceClientFactory.Create("ollama");
await foreach (var token in client.StreamChatAsync(messages))
{
    Console.Write(token); // write each streamed token as it arrives
}

Target Hardware

Component  Minimum           Recommended
GPU        RTX 3060 (8GB)    RTX 4080/5080 (16GB)
RAM        16GB              32GB
OS         Windows 10 1809+  Windows 11
.NET       9.0               9.0

Installation

  1. Download the latest MSIX package from Releases
  2. Double-click to install
  3. Launch from Start Menu

From Source

# Clone and build
git clone https://github.com/mcp-tool-shop-org/InControl-Desktop.git
cd InControl-Desktop
dotnet restore
dotnet build

# Run (requires Ollama running locally)
dotnet run --project src/InControl.App

Prerequisites

InControl requires a local LLM backend. We recommend Ollama:

# Install Ollama from https://ollama.ai/download

# Pull a model
ollama pull llama3.2

# Start the server (runs on http://localhost:11434)
ollama serve
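
Before launching InControl, you can confirm the server is reachable: Ollama exposes an HTTP API on the port above, and GET /api/tags lists the installed models.

```shell
# List installed models via Ollama's HTTP API; falls back to a
# diagnostic message if nothing answers on the default port.
curl -s http://localhost:11434/api/tags || echo "Ollama is not running"
```

If the response lists no models, pull one with `ollama pull` as shown above.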

Building

Verify Build Environment

# Run verification script
./scripts/verify.ps1

Development Build

dotnet build

Release Build

# Creates release artifacts in artifacts/
./scripts/release.ps1

Run Tests

dotnet test

Architecture

InControl follows a clean, layered architecture:

+-------------------------------------------+
|         InControl.App (WinUI 3)           |  UI Layer
+-------------------------------------------+
|         InControl.ViewModels              |  Presentation
+-------------------------------------------+
|         InControl.Services                |  Business Logic
+-------------------------------------------+
|         InControl.Inference               |  LLM Backends
+-------------------------------------------+
|         InControl.Core                    |  Shared Types
+-------------------------------------------+
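
Dependencies point strictly downward: each project references only the layers beneath it. As a sketch (the exact project paths are assumed, not taken from the repository), the InControl.Services project file might contain:

```xml
<ItemGroup>
  <!-- Services depends only on the layers below it -->
  <ProjectReference Include="..\InControl.Inference\InControl.Inference.csproj" />
  <ProjectReference Include="..\InControl.Core\InControl.Core.csproj" />
</ItemGroup>
```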

See ARCHITECTURE.md for detailed design documentation.

Data Storage

All data is stored locally:

Data      Location
Sessions  %LOCALAPPDATA%\InControl\sessions\
Logs      %LOCALAPPDATA%\InControl\logs\
Cache     %LOCALAPPDATA%\InControl\cache\
Exports   %USERPROFILE%\Documents\InControl\exports\
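
Because all of these are plain directories, backing up or migrating data is an ordinary file copy. A minimal sketch of resolving the sessions path from the table (falling back to the conventional Windows layout if %LOCALAPPDATA% is unset):

```shell
# Resolve the InControl sessions directory from the table above.
SESSIONS_DIR="${LOCALAPPDATA:-$HOME/AppData/Local}/InControl/sessions"
echo "$SESSIONS_DIR"
```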

See PRIVACY.md for complete data handling documentation.

Troubleshooting

Common issues and solutions are documented in TROUBLESHOOTING.md.

Quick Fixes

App won't start:

  • Check that .NET 9.0 Runtime is installed
  • Run dotnet --list-runtimes to verify

No models available:

  • Ensure Ollama is running: ollama serve
  • Pull a model: ollama pull llama3.2

GPU not detected:

  • Update NVIDIA drivers to latest version
  • Check CUDA toolkit installation
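
After updating drivers, a quick way to confirm the GPU is visible to the system is NVIDIA's nvidia-smi tool, which ships with the driver:

```shell
# If the GPU is visible, nvidia-smi reports the driver and CUDA versions;
# otherwise fall back to a diagnostic message.
nvidia-smi || echo "No NVIDIA GPU/driver detected"
```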

Contributing

Contributions welcome! Please:

  1. Fork the repository
  2. Create a feature branch
  3. Write tests for new functionality
  4. Submit a pull request

Reporting Issues

  1. Check TROUBLESHOOTING.md first
  2. Use the "Copy Diagnostics" feature in the app
  3. Open an issue with diagnostics info attached

Tech Stack

Layer            Technology
UI Framework     WinUI 3 (Windows App SDK 1.6)
Architecture     MVVM with CommunityToolkit.Mvvm
LLM Integration  OllamaSharp, Microsoft.Extensions.AI
DI Container     Microsoft.Extensions.DependencyInjection
Configuration    Microsoft.Extensions.Configuration
Logging          Microsoft.Extensions.Logging + Serilog

Version

Current version: 0.4.0-alpha

See CHANGELOG.md for release history.

License

MIT


Built for Windows. Powered by local AI.

Compatible and additional computed target framework versions
.NET net9.0 is compatible.
Computed: net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, net10.0-windows.

NuGet packages (1)

Showing the top 1 NuGet package that depends on InControl.Core:

Package Downloads
InControl.Inference

LLM backend abstraction layer for local inference. Provides IInferenceClient and IModelManager interfaces with streaming chat, model management, and health checks. Includes Ollama implementation out of the box.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version  Downloads  Last Updated
1.2.2    106        2/26/2026
1.2.0    98         2/12/2026
1.1.0    92         2/12/2026
1.0.0    109        2/12/2026