Installation

```shell
# .NET CLI
dotnet add package DataLinq.Spark --version 1.0.0

# Package Manager (Visual Studio)
NuGet\Install-Package DataLinq.Spark -Version 1.0.0

# Paket
paket add DataLinq.Spark --version 1.0.0
```

```xml
<!-- PackageReference (csproj) -->
<PackageReference Include="DataLinq.Spark" Version="1.0.0" />

<!-- Central Package Management -->
<PackageVersion Include="DataLinq.Spark" Version="1.0.0" />
<PackageReference Include="DataLinq.Spark" />
```

```
// F# Interactive / C# scripting
#r "nuget: DataLinq.Spark, 1.0.0"

// dotnet run file-based apps
#:package DataLinq.Spark@1.0.0

// Cake
#addin nuget:?package=DataLinq.Spark&version=1.0.0
#tool nuget:?package=DataLinq.Spark&version=1.0.0
```
DataLinq.Spark
LINQ-native Apache Spark integration for DataLinq.NET.
📖 LINQ-to-Spark Guide | DataLinq.NET on GitHub | 🌐 Product Website
Features
- Native LINQ Translation - Write C# LINQ, execute distributed Spark
- Streaming Results - Efficient processing with DataFrames
- Type Safety - Strong typing with automatic column mapping
- Distributed Processing - Scale to petabytes with Apache Spark
- O(1) Memory Writes - Batched streaming for table writes
- Window Functions - Rank, Lead, Lag, running aggregates with expression syntax
- Cases Pattern - Multi-output conditional routing
- In-Memory Push - `context.Push(data)` for test data injection
Quick Start

```csharp
using DataLinq.Spark;

// Connect to Spark (local mode)
using var context = Spark.Connect("local[*]", "MyApp");

// Production cluster examples:
// using var context = Spark.Connect("spark://spark-master:7077", "MyApp");
// using var context = Spark.Connect("yarn", "MyApp");

// Query with LINQ (cluster-side execution)
var stats = context.Read.Table<Order>("sales.orders")
    .Where(o => o.Amount > 1000)
    .GroupBy(o => o.Region)
    .Select(g => new { Region = g.Key, Total = g.Sum(o => o.Amount) })
    .ToList();

// Side effects with ForEach (executes on Spark executors - NOT locally!)
int processed = 0;
context.Read.Table<Order>("sales.orders")
    .ForEach(o => processed++)
    .Do(); // ← Triggers distributed execution; field sync-back happens here

Console.WriteLine($"Processed {processed} orders");
```
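The query above has the same shape as ordinary LINQ-to-Objects, which is what makes it testable without a cluster. A minimal, cluster-free sketch of that shape (the tuple stands in for the `Order` entity, which is not defined here; DataLinq.Spark would translate the same expression tree for cluster-side execution instead of enumerating locally):

```csharp
using System;
using System.Linq;

var orders = new[]
{
    (Region: "EU", Amount: 1500m),
    (Region: "EU", Amount:  800m),  // filtered out: Amount <= 1000
    (Region: "US", Amount: 2500m),
    (Region: "US", Amount: 1200m),
};

var stats = orders
    .Where(o => o.Amount > 1000)
    .GroupBy(o => o.Region)
    .Select(g => new { Region = g.Key, Total = g.Sum(o => o.Amount) })
    .ToList();

foreach (var row in stats)
    Console.WriteLine($"{row.Region}: {row.Total}"); // EU: 1500, US: 3700
```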
Write Operations

```csharp
using DataLinq.Spark; // SaveMode is included; no extra imports needed

// From SparkQuery (server-side)
await context.Read.Table<Order>("orders")
    .Where(o => o.Amount > 1000)
    .WriteParquet("/output/high_value");

await context.Read.Table<Order>("orders")
    .WriteTable("analytics.summary", mode: SaveMode.Overwrite);

// From local IEnumerable (client → server, context required)
await data.WriteTable(context, "orders", mode: SaveMode.Overwrite, bufferSize: 10_000);
await data.WriteParquet(context, "hdfs://data/orders.parquet", bufferSize: 50_000);
```
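The `bufferSize` parameter suggests a chunked upload: the local sequence is consumed in fixed-size batches so only one buffer is resident at a time. A plain-C# sketch of that batching idea (`Batch` and `FlushBatch` are illustrative helpers, not DataLinq.Spark APIs):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Yield fixed-size chunks of a lazily-consumed sequence; memory use is
// bounded by one buffer, not the whole source.
static IEnumerable<List<T>> Batch<T>(IEnumerable<T> source, int size)
{
    var buffer = new List<T>(size);
    foreach (var item in source)
    {
        buffer.Add(item);
        if (buffer.Count == size)
        {
            yield return buffer;
            buffer = new List<T>(size); // old buffer becomes collectible
        }
    }
    if (buffer.Count > 0) yield return buffer; // final partial batch
}

var flushed = new List<int>();                // batch sizes we "uploaded"
void FlushBatch(List<int> batch) => flushed.Add(batch.Count);

foreach (var batch in Batch(Enumerable.Range(0, 25), size: 10))
    FlushBatch(batch);

Console.WriteLine(string.Join(",", flushed)); // 10,10,5
```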
Test Coverage
| Tier | Tests | Pass | Fail | Coverage |
|---|---|---|---|---|
| Unit Tests | 122 | 122 | 0 | 100% |
| Integration Tests | 250 | 250 | 0 | 100% |
| Adversarial Audit | 306 | 306 | 0 | 100% |
| TOTAL | 678 | 678 | 0 | 100% |
Requirements
- .NET 8.0+
- DataLinq.NET 1.0.0+
- Apache Spark 3.5.0+
- DataLinq.Spark license for production
Before You Run
DataLinq.Spark is the developer layer — your DevOps/infra team owns the Spark cluster setup. Before running your application, confirm your Spark backend is reachable:
```shell
# Local mode — verify Spark is available on the machine:
spark-submit --version

# Remote/cluster — verify your master URL is reachable:
curl http://spark-master:8080
```
If Spark.Connect(...) fails immediately, the issue is your Spark environment, not the library. Point your DevOps team to the Apache Spark installation guide.
Execution halts with LicenseException at 1,000 rows
The free Development Tier caps every query at 1,000 rows. This tier is active by default, with no environment variable needed. If your pipeline throws a LicenseException, set a production license key (see the License section below).
ForEach syncs field values back to the driver automatically
DataLinq.Spark implements the Delta Reflection Protocol: after the distributed ForEach executes on Spark executors, field mutations are automatically reflected back to the calling machine. Static fields, lambda-captured variables, and instance fields all sync back.
```csharp
// ✅ Static method — static fields sync back after Do():
query.ForEach(OrderStats.ProcessOrder).Do();
Console.WriteLine(OrderStats.TotalAmount); // ← Updated correctly

// ✅ Lambda closure — captured variables sync back:
int count = 0;
query.ForEach(o => count++).Do();
Console.WriteLine(count); // ← Updated correctly

// ✅ Instance method — instance fields sync back:
var processor = new OrderProcessor();
query.ForEach(processor.Process).Do();
Console.WriteLine(processor.Processed); // ← Updated correctly
```
Note: Collections (`List<T>`, arrays) and `string` concatenation are not synchronized; use numeric or boolean fields for accumulators. The sync uses additive delta merging (`+=`, `++`), so conditional patterns like `if (x > max) max = x` will produce incorrect results across partitions. The Roslyn analyzer warns at compile time (DFSP001, DFSP002).
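Why additive accumulators survive the merge while conditional patterns do not can be seen in a small plain-C# simulation. The two arrays play the role of partitions; each "executor" starts from the driver's snapshot, mutates a local copy, and ships back the delta (final minus initial), which the driver sums. The delta arithmetic here is an assumption used for illustration, not the library's actual wire protocol:

```csharp
using System;
using System.Linq;

int[][] partitions = { new[] { 3, 9, 1 }, new[] { 7, 2 } };

// Additive accumulator: row count (the o => count++ pattern).
int count = 0;
int countDelta = 0;
foreach (var part in partitions)
{
    int local = count;               // executor starts from driver snapshot
    foreach (var _ in part) local++; // count++ per row
    countDelta += local - count;     // delta merged back additively
}
count += countDelta;
Console.WriteLine(count);            // 5 — correct

// Conditional pattern: if (x > max) max = x.
int max = 0;
int maxDelta = 0;
foreach (var part in partitions)
{
    int local = max;
    foreach (var x in part) if (x > local) local = x;
    maxDelta += local - max;         // deltas 9 and 7 add up
}
max += maxDelta;
Console.WriteLine(max);              // 16 — wrong; the true max is 9
```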
Support & Issues
📧 Contact: support@get-datalinq.net
🐛 Report Issues: github.com/improveTheWorld/DataLinq.NET/issues
License
Development Tier (Free)
Use DataLinq.Spark free for development and testing up to 1,000 rows per query.
The Development Tier is active by default — no environment variable needed. Just install the package and start coding.
Production License
For production workloads (unlimited rows), obtain a license at:
- 🌐 Pricing: https://get-datalinq.net/pricing
- 📧 Contact: support@get-datalinq.net
Set your license key as an environment variable (auto-detected at runtime):
```shell
# PowerShell
$env:DATALINQ_LICENSE_KEY="your-license-key"

# Bash/Linux/macOS
export DATALINQ_LICENSE_KEY="your-license-key"

# Docker / Kubernetes (Dockerfile)
ENV DATALINQ_LICENSE_KEY=your-license-key
```
Security: Never commit the license key to source control. Set it in your deployment environment (CI/CD secrets, Azure Key Vault, AWS Secrets Manager, etc.).
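Since the key is auto-detected from the environment, a fail-fast startup check can surface a missing key before any query hits the row cap. A minimal sketch in plain .NET (not a DataLinq.Spark API; the `SetEnvironmentVariable` call only simulates an already-configured deployment environment):

```csharp
using System;

// Simulate a deployment environment where the key was set externally.
Environment.SetEnvironmentVariable("DATALINQ_LICENSE_KEY", "demo-key");

string? key = Environment.GetEnvironmentVariable("DATALINQ_LICENSE_KEY");
if (string.IsNullOrWhiteSpace(key))
    throw new InvalidOperationException(
        "DATALINQ_LICENSE_KEY is not set; queries will hit the 1,000-row Development Tier cap.");

Console.WriteLine("license key detected");
```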
| Product | Compatible and computed target frameworks |
|---|---|
| .NET | net8.0 is compatible. net9.0, net10.0, and their platform-specific TFMs (android, browser, ios, maccatalyst, macos, tvos, windows) were computed. |
Dependencies (net8.0)
- DataLinq.Net (>= 1.0.0)
- Microsoft.Spark (>= 2.3.0)
v1.0.0: Initial release — LINQ-native Apache Spark integration with Auto-UDF, Delta Reflection (ForEach), Cases pattern, streaming writes, and full math function support.