Microsoft.ML.Tokenizers
2.0.0-preview.1.25127.4
.NET CLI:
dotnet add package Microsoft.ML.Tokenizers --version 2.0.0-preview.1.25127.4
Package Manager:
NuGet\Install-Package Microsoft.ML.Tokenizers -Version 2.0.0-preview.1.25127.4
PackageReference:
<PackageReference Include="Microsoft.ML.Tokenizers" Version="2.0.0-preview.1.25127.4" />
Central Package Management (Directory.Packages.props and project file):
<PackageVersion Include="Microsoft.ML.Tokenizers" Version="2.0.0-preview.1.25127.4" />
<PackageReference Include="Microsoft.ML.Tokenizers" />
Paket CLI:
paket add Microsoft.ML.Tokenizers --version 2.0.0-preview.1.25127.4
Script & Interactive:
#r "nuget: Microsoft.ML.Tokenizers, 2.0.0-preview.1.25127.4"
Cake:
#addin nuget:?package=Microsoft.ML.Tokenizers&version=2.0.0-preview.1.25127.4&prerelease
#tool nuget:?package=Microsoft.ML.Tokenizers&version=2.0.0-preview.1.25127.4&prerelease
About
Microsoft.ML.Tokenizers provides an abstraction for tokenizers as well as implementations of common tokenization algorithms.
Key Features
- Extensible tokenizer architecture that allows for specialization of Normalizer, PreTokenizer, Model/Encoder, Decoder
- BPE - Byte pair encoding model
- English Roberta model
- Tiktoken model
- Llama model
- Phi2 model
How to Use
using Microsoft.ML.Tokenizers;
using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
//
// Using Tiktoken Tokenizer
//
// Initialize the tokenizer for the `gpt-4o` model. This instance should be cached for all subsequent use.
Tokenizer tokenizer = TiktokenTokenizer.CreateForModel("gpt-4o");
string source = "Text tokenization is the process of splitting a string into a list of tokens.";
Console.WriteLine($"Tokens: {tokenizer.CountTokens(source)}");
// prints: Tokens: 16
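// Find the index in the text where the final 5 tokens begin; the substring from that index to the end encodes to at most 5 tokens.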
var trimIndex = tokenizer.GetIndexByTokenCountFromEnd(source, 5, out string processedText, out _);
Console.WriteLine($"5 tokens from end: {processedText.Substring(trimIndex)}");
// prints: 5 tokens from end: a list of tokens.
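// Find the index in the text bounding the first 5 tokens; the substring before that index encodes to at most 5 tokens.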
trimIndex = tokenizer.GetIndexByTokenCount(source, 5, out processedText, out _);
Console.WriteLine($"5 tokens from start: {processedText.Substring(0, trimIndex)}");
// prints: 5 tokens from start: Text tokenization is the
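// Encode the full input into its sequence of token ids.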
IReadOnlyList<int> ids = tokenizer.EncodeToIds(source);
Console.WriteLine(string.Join(", ", ids));
// prints: 1199, 4037, 2065, 374, 279, 1920, 315, 45473, 264, 925, 1139, 264, 1160, 315, 11460, 13
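// Additional illustration (not part of the original sample): Decode, defined on the Tokenizer base class, maps the ids back to text.
string? roundTrip = tokenizer.Decode(ids);
Console.WriteLine(roundTrip);
// expected: Text tokenization is the process of splitting a string into a list of tokens.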
//
// Using Llama Tokenizer
//
// Open a stream to the remote Llama tokenizer model data file.
using HttpClient httpClient = new();
const string modelUrl = @"https://huggingface.co/hf-internal-testing/llama-tokenizer/resolve/main/tokenizer.model";
using Stream remoteStream = await httpClient.GetStreamAsync(modelUrl);
// Create the Llama tokenizer using the remote stream. This should be cached for all subsequent use.
Tokenizer llamaTokenizer = LlamaTokenizer.Create(remoteStream);
string input = "Hello, world!";
ids = llamaTokenizer.EncodeToIds(input);
Console.WriteLine(string.Join(", ", ids));
// prints: 1, 15043, 29892, 3186, 29991
Console.WriteLine($"Tokens: {llamaTokenizer.CountTokens(input)}");
// prints: Tokens: 5
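When the token strings themselves are needed rather than just ids or counts, the Tokenizer base class also exposes EncodeToTokens. The following is a minimal sketch continuing the Llama example above; it assumes the overload taking the input text plus an out parameter for the normalized text, and that each EncodedToken exposes Id and Value properties:
// Inspect each token produced for the input: its id and its string value.
foreach (EncodedToken token in llamaTokenizer.EncodeToTokens(input, out _))
{
    Console.WriteLine($"{token.Id} -> {token.Value}");
}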
Main Types
The main types provided by this library are:
- Microsoft.ML.Tokenizers.Tokenizer
- Microsoft.ML.Tokenizers.BpeTokenizer
- Microsoft.ML.Tokenizers.EnglishRobertaTokenizer
- Microsoft.ML.Tokenizers.TiktokenTokenizer
- Microsoft.ML.Tokenizers.Normalizer
- Microsoft.ML.Tokenizers.PreTokenizer
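Because every concrete tokenizer derives from the abstract Tokenizer class, application code can be written once against that abstraction and reused with Tiktoken, Llama, BPE, or Roberta tokenizers alike. A small illustrative sketch (the helper name is hypothetical; CountTokens comes from the base class, as used in the samples above):
// Works with any Tokenizer implementation, since CountTokens is defined on the abstract base class.
static bool FitsInTokenBudget(Tokenizer tokenizer, string text, int maxTokens)
    => tokenizer.CountTokens(text) <= maxTokens;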
Feedback & Contributing
Microsoft.ML.Tokenizers is released as open source under the MIT license. Bug reports and contributions are welcome at the GitHub repository.
Product | Compatible and additional computed target frameworks |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
.NETStandard 2.0
- Google.Protobuf (>= 3.27.1)
- Microsoft.Bcl.HashCode (>= 6.0.0)
- Microsoft.Bcl.Memory (>= 9.0.0)
- System.Text.Json (>= 8.0.5)
net8.0
- Google.Protobuf (>= 3.27.1)
- System.Text.Json (>= 8.0.5)
NuGet packages (17)
Showing the top 5 NuGet packages that depend on Microsoft.ML.Tokenizers:
Package | Description |
---|---|
Microsoft.ML.Tokenizers.Data.O200kBase | Includes the Tiktoken tokenizer data file o200k_base.tiktoken, used by models such as gpt-4o. |
Microsoft.ML.TorchSharp | Contains the ML.NET integration of TorchSharp. |
Microsoft.ML.Tokenizers.Data.Cl100kBase | Includes the Tiktoken tokenizer data file cl100k_base.tiktoken, used by models such as GPT-4. |
Microsoft.ML.Tokenizers.Data.P50kBase | Includes the Tiktoken tokenizer data file p50k_base.tiktoken, used by models such as text-davinci-002. |
Microsoft.Extensions.AI.Evaluation | A library containing core abstractions for evaluating responses received from an LLM. |
GitHub repositories (13)
Showing the top 13 popular GitHub repositories that depend on Microsoft.ML.Tokenizers:
Repository | Description |
---|---|
microsoft/semantic-kernel | Integrate cutting-edge LLM technology quickly and easily into your apps. |
dotnet/extensions | A suite of libraries that provide facilities commonly needed when creating production-ready applications. |
microsoft/WhatTheHack | A collection of challenge-based hack-a-thons including student guide, coach guide, lecture presentations, sample/instructional code, and templates. See the What The Hack website at https://aka.ms/wth. |
dotnet/ResXResourceManager | Manage localization of all ResX-based resources in one central place. |
microsoft/ai-dev-gallery | An open-source project for Windows developers to learn how to add AI with local models and APIs to Windows apps. |
axzxs2001/Asp.NetCoreExperiment | All earlier projects have been moved to the **OleVersion** directory for preservation; new samples mainly target .NET 5.0, partly upgrading earlier samples and partly distilling prior working experience for reference. |
sdcb/chats | User-friendly enterprise AI interface (supports Ollama, OpenAI API, ...). |
PowerShell/AIShell | An interactive shell to work with AI-powered assistance providers. |
Azure-Samples/cosmosdb-chatgpt | Sample application that combines Azure Cosmos DB with the Azure OpenAI ChatGPT service. |
dmitry-brazhenko/SharpToken | A C# library for tokenizing natural language text, based on the tiktoken Python library and designed to be fast and accurate. |
lindexi/lindexi_gd | Code used in blog posts. |
Azure/Vector-Search-AI-Assistant | Microsoft official Build Modern AI Apps reference solutions and content, demonstrating how to build Copilot applications that incorporate hero Azure services including Azure OpenAI Service, Azure Container Apps (or AKS), and Azure Cosmos DB for NoSQL with Vector Search. |
alnkesq/AppViewLite | A Bluesky appview focused on low resource consumption. |
Version History
Version | Downloads | Last updated |
---|---|---|
2.0.0-preview.1.25127.4 | 4,879 | 2/28/2025 |
2.0.0-preview.1.25125.4 | 235 | 2/25/2025 |
1.0.2 | 21,612 | 2/26/2025 |
1.0.1 | 98,464 | 1/15/2025 |
1.0.0 | 231,202 | 11/14/2024 |
0.22.0 | 12,516 | 11/13/2024 |
0.22.0-preview.24526.1 | 3,253 | 10/27/2024 |
0.22.0-preview.24522.7 | 3,146 | 10/23/2024 |
0.22.0-preview.24378.1 | 232,392 | 7/29/2024 |
0.22.0-preview.24271.1 | 174,213 | 5/21/2024 |
0.22.0-preview.24179.1 | 167,779 | 4/2/2024 |
0.22.0-preview.24162.2 | 21,126 | 3/13/2024 |
0.21.1 | 119,095 | 1/18/2024 |
0.21.0 | 54,239 | 11/27/2023 |
0.21.0-preview.23511.1 | 52,360 | 10/13/2023 |
0.21.0-preview.23266.6 | 52,038 | 5/17/2023 |
0.21.0-preview.22621.2 | 2,227 | 12/22/2022 |
0.20.1 | 100,031 | 2/1/2023 |
0.20.1-preview.22573.9 | 2,693 | 11/24/2022 |
0.20.0 | 33,982 | 11/8/2022 |
0.20.0-preview.22551.1 | 269 | 11/1/2022 |