DataPackers 1.0.0
.NET CLI:
dotnet add package DataPackers --version 1.0.0
Package Manager:
NuGet\Install-Package DataPackers -Version 1.0.0
PackageReference:
<PackageReference Include="DataPackers" Version="1.0.0" />
Central Package Management (Directory.Packages.props plus project file):
<PackageVersion Include="DataPackers" Version="1.0.0" />
<PackageReference Include="DataPackers" />
Paket CLI:
paket add DataPackers --version 1.0.0
Script & Interactive:
#r "nuget: DataPackers, 1.0.0"
File-based apps:
#:package DataPackers@1.0.0
Cake:
#addin nuget:?package=DataPackers&version=1.0.0
#tool nuget:?package=DataPackers&version=1.0.0
DataPackers - High-Performance PAK File System for .NET
DataPackers is a high-performance resource packaging library for .NET that provides game-engine-style PAK file functionality. It bundles multiple files into a single archive with support for compression, encryption, and on-demand loading.
Features
Core Capabilities
- 📦 File Packaging: Bundle multiple files and directories into a single PAK archive
- 🚀 On-Demand Loading: Stream-based reading without loading entire archive into memory
- 🗜️ Compression: Support for Deflate, GZip, and Brotli compression algorithms
- 🔒 Encryption: AES-256-CBC encryption for secure asset protection
- ✅ Integrity Verification: SHA-256 hash validation to detect corruption
- ⚡ High Performance: Zero-allocation APIs using Span<T> and Memory<T>
- 🔄 Thread-Safe: Concurrent reading of different files from multiple threads
- 💾 Memory Efficient: ArrayPool-based buffer management to reduce GC pressure
Performance Optimizations
- Zero-allocation file reading with Span<byte> and ReadOnlySpan<char>
- Struct-based data structures to minimize heap allocations
- ArrayPool<T> for efficient buffer reuse
- Stack allocation (stackalloc) for small temporary buffers
- String interning for file name deduplication
- Optional LRU caching for frequently accessed files
- Buffered I/O for optimal disk access patterns
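The ArrayPool<T> and stackalloc techniques listed above are standard .NET patterns rather than DataPackers-specific APIs; a minimal sketch of what they look like in application code:
using System;
using System.Buffers;
// Rent a reusable buffer instead of allocating a fresh array per read
byte[] buffer = ArrayPool<byte>.Shared.Rent(64 * 1024);
try
{
    // ... fill and process the buffer ...
}
finally
{
    ArrayPool<byte>.Shared.Return(buffer);
}
// For small, short-lived scratch space, allocate on the stack instead of the heap
Span<byte> scratch = stackalloc byte[256];
// ... use scratch ...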
Installation
# Add reference to your project
dotnet add reference DataPackers/DataPackers.csproj
Or add directly to your .csproj:
<ItemGroup>
<ProjectReference Include="..\DataPackers\DataPackers.csproj" />
</ItemGroup>
Quick Start
Basic Packing
using DataPackers;
// Create packer with default options
var options = new PakPackerOptions
{
CalculateHash = true
};
using var packer = new PakPacker(options);
// Pack entire directory
packer.PackDirectory(@"C:\MyAssets", @"C:\Output\assets.pak");
Basic Reading
using DataPackers;
// Open PAK file
using var reader = new PakReader(@"C:\Output\assets.pak");
// Check if file exists
if (reader.FileExists("config.json"))
{
// Read as text
string config = reader.ReadAllText("config.json");
// Read as bytes
byte[] data = reader.ReadAllBytes("image.png");
// Stream large files
using var stream = reader.OpenFile("video.mp4");
// Process stream...
}
Usage Examples
Compression
Reduce PAK file size with compression:
var options = new PakPackerOptions
{
EnableCompression = true,
CompressionAlgorithm = CompressionAlgorithm.Brotli,
CompressionLevel = 6 // 0-9, higher = better compression
};
using var packer = new PakPacker(options);
packer.PackDirectory(@"C:\Assets", @"C:\Output\compressed.pak");
Reading compressed files is automatic:
using var reader = new PakReader(@"C:\Output\compressed.pak");
string text = reader.ReadAllText("file.txt"); // Automatically decompressed
Encryption
Protect your assets with AES-256 encryption:
// Generate encryption key (32 bytes for AES-256)
byte[] key = new byte[32];
using var rng = System.Security.Cryptography.RandomNumberGenerator.Create();
rng.GetBytes(key);
// Pack with encryption
var packOptions = new PakPackerOptions
{
EnableEncryption = true,
EncryptionKey = key
};
using var packer = new PakPacker(packOptions);
packer.PackDirectory(@"C:\SecureAssets", @"C:\Output\encrypted.pak");
// Read with decryption
var readOptions = new PakReaderOptions
{
DecryptionKey = key
};
using var reader = new PakReader(@"C:\Output\encrypted.pak", readOptions);
string data = reader.ReadAllText("secret.txt");
Password-Based Encryption
Derive encryption key from a password:
using System.Security.Cryptography;
// Derive key from password
string password = "MySecurePassword123!";
byte[] salt = new byte[16];
RandomNumberGenerator.Fill(salt);
using var pbkdf2 = new Rfc2898DeriveBytes(
password,
salt,
100000,
HashAlgorithmName.SHA256
);
byte[] key = pbkdf2.GetBytes(32);
// Use key for packing/reading
var options = new PakPackerOptions
{
EnableEncryption = true,
EncryptionKey = key
};
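To open an encrypted archive later, the same key must be re-derived from the password and the salt that was stored at pack time. A minimal sketch, assuming the salt was persisted somewhere the application can read it back (the storedSalt variable and the PAK path are placeholders):
using System.Security.Cryptography;
byte[] storedSalt = salt; // in a real app, load the salt persisted at pack time
using var pbkdf2 = new Rfc2898DeriveBytes(
    password,
    storedSalt,
    100000,                 // must match the iteration count used when packing
    HashAlgorithmName.SHA256
);
byte[] key = pbkdf2.GetBytes(32);
var readOptions = new PakReaderOptions { DecryptionKey = key };
using var reader = new PakReader(@"C:\Output\encrypted.pak", readOptions);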
Integrity Verification
Detect file corruption with SHA-256 hashing:
// Pack with hash calculation
var packOptions = new PakPackerOptions
{
CalculateHash = true
};
using var packer = new PakPacker(packOptions);
packer.PackDirectory(@"C:\Assets", @"C:\Output\verified.pak");
// Read with verification
var readOptions = new PakReaderOptions
{
VerifyHash = true // Throws PakIntegrityException if corrupted
};
using var reader = new PakReader(@"C:\Output\verified.pak", readOptions);
try
{
byte[] data = reader.ReadAllBytes("important.dat");
// Data integrity verified ✓
}
catch (PakIntegrityException ex)
{
Console.WriteLine($"File corrupted: {ex.FileName}");
}
Streaming Large Files
Memory-efficient reading of large files:
using var reader = new PakReader(@"C:\Output\assets.pak");
// Open as stream
using var stream = reader.OpenFile("large_video.mp4");
Console.WriteLine($"File size: {stream.Length} bytes");
// Read in chunks
byte[] buffer = new byte[8192];
while (true)
{
int bytesRead = stream.Read(buffer, 0, buffer.Length);
if (bytesRead == 0) break;
// Process chunk...
}
// Seeking is supported
stream.Seek(1024, SeekOrigin.Begin);
stream.Seek(-512, SeekOrigin.End);
Performance Caching
Cache frequently accessed files in memory:
var options = new PakReaderOptions
{
EnableCache = true,
CacheMaxSize = 50 * 1024 * 1024 // 50 MB cache
};
using var reader = new PakReader(@"C:\Output\assets.pak", options);
// First read - loads from disk
byte[] data1 = reader.ReadAllBytes("config.json");
// Second read - loads from cache (much faster!)
byte[] data2 = reader.ReadAllBytes("config.json");
Zero-Allocation Reading
High-performance reading with no heap allocations:
using var reader = new PakReader(@"C:\Output\assets.pak");
// Pre-allocate buffer
byte[] buffer = new byte[1024];
// Read directly into buffer (zero allocations)
int bytesRead = reader.ReadInto("file.txt", buffer.AsSpan());
// Process buffer[0..bytesRead]
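For small files, the destination buffer itself can live on the stack, combining ReadInto with stackalloc (a sketch; it assumes the file is known to fit in the buffer):
// Stack-allocated buffer - no heap allocation at all for small reads
Span<byte> small = stackalloc byte[1024];
int read = reader.ReadInto("file.txt", small);
// Process small[..read]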
Async Operations
Asynchronous file reading:
using var reader = new PakReader(@"C:\Output\assets.pak");
// Async read
byte[] data = await reader.ReadAllBytesAsync("file.dat");
// Async text read
string text = await reader.ReadAllTextAsync("readme.txt");
// Async with cancellation
var cts = new CancellationTokenSource();
byte[] result = await reader.ReadAllBytesAsync("large.bin", cts.Token);
File Extraction
Extract files from PAK to disk:
using var reader = new PakReader(@"C:\Output\assets.pak");
// Extract single file
reader.ExtractFile("config.json", @"C:\Extracted\config.json");
// Extract all files (preserves directory structure)
reader.ExtractAll(@"C:\Extracted");
File Filtering
Pack only specific file types:
var options = new PakPackerOptions
{
FileFilter = filePath =>
{
string ext = Path.GetExtension(filePath).ToLowerInvariant();
return ext == ".json" || ext == ".xml" || ext == ".txt";
}
};
using var packer = new PakPacker(options);
packer.PackDirectory(@"C:\Assets", @"C:\Output\configs.pak");
File Metadata
Query file information:
using var reader = new PakReader(@"C:\Output\assets.pak");
// Get all file names
foreach (string fileName in reader.GetAllFileNames())
{
Console.WriteLine(fileName);
}
// Get detailed file entry
var entry = reader.GetFileEntry("image.png");
Console.WriteLine($"Original Size: {entry.OriginalSize}");
Console.WriteLine($"Compressed Size: {entry.CompressedSize}");
Console.WriteLine($"Is Compressed: {entry.IsCompressed}");
Console.WriteLine($"Is Encrypted: {entry.IsEncrypted}");
Console.WriteLine($"Compression Algorithm: {entry.CompressionAlgorithm}");
API Reference
PakPacker
Main class for creating PAK archives.
public class PakPacker : IDisposable
{
public PakPacker(PakPackerOptions options);
// Pack entire directory
public void PackDirectory(string sourceDirectory, string outputPakPath);
// Pack specific files
public void PackFiles(IEnumerable<string> filePaths, string outputPakPath,
string? baseDirectory = null);
}
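PackFiles is not shown in the examples above; a short sketch, with the assumption that baseDirectory determines the relative names stored in the archive:
var files = new[]
{
    @"C:\Assets\Textures\hero.png",
    @"C:\Assets\Config\settings.json"
};
using var packer = new PakPacker(new PakPackerOptions { CalculateHash = true });
// Assumption: with baseDirectory = @"C:\Assets", the entries are stored as
// "Textures\hero.png" and "Config\settings.json"
packer.PackFiles(files, @"C:\Output\selected.pak", baseDirectory: @"C:\Assets");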
PakPackerOptions
Configuration for packing operations.
public class PakPackerOptions
{
public bool EnableCompression { get; set; }
public CompressionAlgorithm CompressionAlgorithm { get; set; }
public int CompressionLevel { get; set; } // 0-9
public bool EnableEncryption { get; set; }
public ReadOnlyMemory<byte> EncryptionKey { get; set; }
public bool CalculateHash { get; set; }
public int BufferSize { get; set; }
public Func<string, bool>? FileFilter { get; set; }
public bool UseArrayPool { get; set; }
}
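A sketch of a fully populated configuration using the options above (values are illustrative, not recommended defaults):
var options = new PakPackerOptions
{
    EnableCompression = true,
    CompressionAlgorithm = CompressionAlgorithm.Brotli,
    CompressionLevel = 6,
    EnableEncryption = false,
    CalculateHash = true,
    BufferSize = 64 * 1024,
    UseArrayPool = true,
    // Skip temporary files (illustrative filter)
    FileFilter = path => !path.EndsWith(".tmp", StringComparison.OrdinalIgnoreCase)
};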
PakReader
Main class for reading from PAK archives.
public class PakReader : IDisposable
{
public PakReader(string pakFilePath, PakReaderOptions? options = null);
public int FileCount { get; }
// Query methods
public bool FileExists(string fileName);
public IReadOnlyCollection<string> GetAllFileNames();
public FileEntry GetFileEntry(string fileName);
// Read methods
public byte[] ReadAllBytes(string fileName);
public string ReadAllText(string fileName, Encoding? encoding = null);
public Stream OpenFile(string fileName);
// Zero-allocation methods
public int ReadInto(string fileName, Span<byte> buffer);
public Memory<byte> ReadAsMemory(string fileName);
// Async methods
public Task<byte[]> ReadAllBytesAsync(string fileName,
CancellationToken cancellationToken = default);
public Task<string> ReadAllTextAsync(string fileName, Encoding? encoding = null,
CancellationToken cancellationToken = default);
// Extraction methods
public void ExtractFile(string fileName, string outputPath);
public void ExtractAll(string outputDirectory);
}
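FileCount and ReadAsMemory are not covered by the examples above; a brief sketch of both:
using var reader = new PakReader(@"C:\Output\assets.pak");
Console.WriteLine($"Archive contains {reader.FileCount} files");
// ReadAsMemory returns the contents as Memory<byte> for Span-friendly processing
Memory<byte> payload = reader.ReadAsMemory("image.png");
Console.WriteLine($"Read {payload.Length} bytes");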
PakReaderOptions
Configuration for reading operations.
public class PakReaderOptions
{
public ReadOnlyMemory<byte> DecryptionKey { get; set; }
public bool VerifyHash { get; set; }
public bool EnableCache { get; set; }
public int CacheMaxSize { get; set; }
public bool UseArrayPool { get; set; }
}
CompressionAlgorithm
Supported compression algorithms.
public enum CompressionAlgorithm : byte
{
None = 0,
Deflate = 1,
GZip = 2,
Brotli = 3
}
Exceptions
Custom exception types for error handling.
public class PakException : Exception { }
public class PakFormatException : PakException { }
public class PakEncryptionException : PakException { }
public class PakIntegrityException : PakException
{
public string FileName { get; }
}
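A sketch of handling these exception types around a read, ordered from most to least specific (when each type is thrown is inferred from its name, not specified above):
using var reader = new PakReader(@"C:\Output\verified.pak",
    new PakReaderOptions { VerifyHash = true });
try
{
    byte[] data = reader.ReadAllBytes("important.dat");
}
catch (PakIntegrityException ex)
{
    Console.WriteLine($"Corrupted entry: {ex.FileName}");
}
catch (PakEncryptionException)
{
    Console.WriteLine("Wrong or missing decryption key");
}
catch (PakFormatException)
{
    Console.WriteLine("Not a valid PAK archive");
}
catch (PakException ex)
{
    Console.WriteLine($"PAK error: {ex.Message}");
}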
PAK File Format
The PAK file uses a custom binary format:
┌─────────────────────────────────────────┐
│ PAK File Header (36 bytes) │
│ - Magic Number: "PAK\0" (4 bytes) │
│ - Version: uint32 (4 bytes) │
│ - Index Offset: long (8 bytes) │
│ - Index Size: int (4 bytes) │
│ - File Count: int (4 bytes) │
│ - Flags: uint32 (4 bytes) │
│ - Reserved: 8 bytes │
├─────────────────────────────────────────┤
│ File Data Blocks │
│ - File 1 Data (variable size) │
│ - File 2 Data (variable size) │
│ - ... │
├─────────────────────────────────────────┤
│ File Index Section │
│ For each file: │
│ - File Name Length (4 bytes) │
│ - File Name (UTF-8 string) │
│ - Offset (8 bytes) │
│ - Original Size (8 bytes) │
│ - Compressed Size (8 bytes) │
│ - Hash (32 bytes, SHA-256) │
│ - Flags (4 bytes) │
└─────────────────────────────────────────┘
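Based on the layout above, a standalone sketch that reads the 36-byte header with BinaryReader (illustrative only, not part of the library API; little-endian field encoding is assumed):
using System;
using System.IO;
using System.Text;
using var fs = File.OpenRead(@"C:\Output\assets.pak");
using var br = new BinaryReader(fs);
string magic = Encoding.ASCII.GetString(br.ReadBytes(4)); // expected "PAK\0"
uint version = br.ReadUInt32();
long indexOffset = br.ReadInt64();
int indexSize = br.ReadInt32();
int fileCount = br.ReadInt32();
uint flags = br.ReadUInt32();
br.ReadBytes(8); // reserved
Console.WriteLine($"PAK v{version}: {fileCount} files, index at offset {indexOffset}");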
Performance Characteristics
Memory Usage
- Index Loading: ~200 bytes per file entry
- 10,000 files: ~2 MB index memory
- Streaming: <10 MB for 100 MB file reads
- Zero-allocation APIs: No heap allocations for reads
Speed Benchmarks
- Index Loading: <100ms for 10,000 files
- Small File Read (<1KB): <1ms
- Large File Stream: Constant memory usage
- Concurrent Reads: Linear scaling with thread count
Compression Ratios (typical)
- Text files: 60-80% reduction
- JSON/XML: 70-85% reduction
- Binary data: 10-30% reduction
- Already compressed: Minimal change
Thread Safety
PakReader is designed for concurrent access:
- ✅ Multiple threads can read different files simultaneously
- ✅ Internal synchronization ensures thread-safe stream access
- ✅ No external locking required
- ⚠️ PakPacker is not thread-safe (single-threaded packing only)
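A minimal sketch of the concurrent-read pattern described above, sharing one PakReader across worker threads:
using System.Threading.Tasks;
using var reader = new PakReader(@"C:\Output\assets.pak");
// Safe: multiple threads reading different files through the same reader
Parallel.ForEach(reader.GetAllFileNames(), fileName =>
{
    byte[] data = reader.ReadAllBytes(fileName);
    // Process data...
});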
Best Practices
Security
- Never embed encryption keys in code - Use secure key storage (see the sketch after this list)
- Use strong passwords for key derivation (12+ characters)
- Store salt separately from encrypted PAK files
- Enable hash verification for critical assets
- Use AES-256 (32-byte keys) for maximum security
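One minimal way to keep the key out of source code is to supply it through configuration, for example a base64-encoded environment variable (an assumption for illustration; any secret store works the same way):
// Assumption: DATAPACKERS_KEY holds a base64-encoded 32-byte key,
// injected by a secret manager or deployment pipeline
string? keyBase64 = Environment.GetEnvironmentVariable("DATAPACKERS_KEY");
if (keyBase64 is null)
    throw new InvalidOperationException("DATAPACKERS_KEY is not set");
byte[] key = Convert.FromBase64String(keyBase64);
var options = new PakReaderOptions { DecryptionKey = key, VerifyHash = true };
using var reader = new PakReader(@"C:\Output\encrypted.pak", options);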
Performance
- Enable ArrayPool for reduced GC pressure
- Use caching for frequently accessed small files
- Stream large files instead of loading entirely
- Use zero-allocation APIs in hot paths
- Pack with compression for network distribution
Organization
- Group related assets in separate PAK files
- Use file filters to create specialized archives
- Preserve directory structure for maintainability
- Document encryption keys and their storage locations
- Version your PAK files for update management
Requirements
- .NET 8.0 or higher
- C# 12 language features
- Unsafe code enabled (for the Hash256 struct)
Project Structure
DataPackers/
├── DataPackers/ # Main library
│ ├── PakPacker.cs # Packing functionality
│ ├── PakReader.cs # Reading functionality
│ ├── PakStream.cs # Virtual file stream
│ ├── CompressionHandler.cs
│ ├── EncryptionHandler.cs
│ ├── IntegrityValidator.cs
│ ├── FileCache.cs
│ └── ...
├── DataPackers.Tests/ # Unit and integration tests
├── Examples/ # Usage examples
│ ├── BasicUsageExample.cs
│ └── AdvancedFeaturesExample.cs
└── README.md # This file
Testing
Run the test suite:
cd DataPackers.Tests
dotnet test
Test coverage includes:
- Unit tests for all core components
- Integration tests for end-to-end workflows
- Performance tests for benchmarking
- Compression algorithm tests
- Encryption/decryption tests
- Integrity verification tests
- Concurrent access tests
Examples
Complete working examples are available in the Examples/ directory:
- BasicUsageExample.cs: Simple packing and reading operations
- AdvancedFeaturesExample.cs: Compression, encryption, and security features
Run examples:
cd Examples
dotnet run
License
This project is licensed under the MIT License - see the LICENSE file for details.
Contributing
Contributions are welcome! Please feel free to submit issues and pull requests.
Acknowledgments
- Inspired by game engine PAK file systems (Unreal Engine, Unity)
- Uses .NET's high-performance APIs (Span<T>, Memory<T>, ArrayPool<T>)
- Implements zero-allocation patterns for optimal performance
Support
For questions, issues, or feature requests, please open an issue on the project repository.
DataPackers - Efficient, secure, and high-performance resource packaging for .NET applications.
| Product | Compatible and additional computed target frameworks |
|---|---|
| .NET | net9.0 is compatible. net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |
Dependencies
- net9.0: No dependencies.
| Version | Downloads | Last Updated |
|---|---|---|
| 1.0.0 | 152 | 11/2/2025 |
v1.0.0 - Initial Release
- PAK file packing and reading
- Compression support (Deflate, GZip, Brotli)
- AES-256-CBC encryption
- SHA-256 integrity verification
- Zero-allocation APIs with Span<T> and Memory<T>
- Thread-safe concurrent reading
- LRU caching for frequently accessed files
- Stream-based reading for large files
- ArrayPool-based buffer management