DoIt.S3Client 1.1.0

Installation

.NET CLI:
    dotnet add package DoIt.S3Client --version 1.1.0

Package Manager Console in Visual Studio (uses the NuGet module's version of Install-Package):
    NuGet\Install-Package DoIt.S3Client -Version 1.1.0

PackageReference (copy this XML node into a project file that supports PackageReference):
    <PackageReference Include="DoIt.S3Client" Version="1.1.0" />

Paket CLI:
    paket add DoIt.S3Client --version 1.1.0

F# Interactive and Polyglot Notebooks (copy this #r directive into the interactive tool or a script):
    #r "nuget: DoIt.S3Client, 1.1.0"

Cake, as an addin or as a tool:
    #addin nuget:?package=DoIt.S3Client&version=1.1.0
    #tool nuget:?package=DoIt.S3Client&version=1.1.0

do IT S3 Client


A simple S3 (Simple Storage Service) client with support for uploading objects of unknown size.

Why on earth would I want to upload objects of unknown size?!

Well, you might, for example, want to compress or encrypt (or both!) a file or stream on the fly while uploading it to your S3 bucket.

Good point! How would I do that?

using System;
using System.IO;
using System.IO.Compression;
using DoIt.S3Client;

// Create a client, the main interface to your S3 bucket.
using var client = new S3Client(new Uri("https://your-bucket-endpoint"), "your-region", "access-key", "secret-key");

// Open a stream for uploading data whose size is not known beforehand, such as a ZIP archive created on the fly.
using (var zip = new ZipArchive(await client.OpenObjectForWritingAsync("test.zip", "application/zip"), ZipArchiveMode.Create))
{
    // Dynamically add data to your ZIP archive until you've written everything you need to write.
    var entry = zip.CreateEntry("test.txt");
    using var writer = new StreamWriter(entry.Open());
    await writer.WriteLineAsync("Hello, this is a test object.");
}

How does it work?

The client initiates a multipart upload when an S3 stream is created. The "current" part is held in memory and sent to your S3 bucket only when either the current part reaches the maximum part size or the stream is closed/disposed.

Even if you upload very large files to your S3 bucket, only a single part will be held in memory at any time.
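To make the buffer-and-flush idea concrete, here is a minimal, purely illustrative sketch of the pattern (this is not DoIt.S3Client's actual implementation; the real stream sends each flushed buffer as one part of an S3 multipart upload instead of printing a message):

```csharp
using System;
using System.IO;

// Illustrative only: a write-only stream that buffers data in memory and
// "uploads" a part whenever the buffer reaches a fixed part size.
class PartBufferingStream : Stream
{
    private readonly MemoryStream _buffer = new();
    private readonly int _partSize;
    private int _partNumber;

    public PartBufferingStream(int partSize) => _partSize = partSize;

    public override void Write(byte[] data, int offset, int count)
    {
        _buffer.Write(data, offset, count);
        // Only one part's worth of data is ever held in memory.
        if (_buffer.Length >= _partSize)
            FlushPart();
    }

    private void FlushPart()
    {
        _partNumber++;
        // A real client would send the buffer's contents as part number _partNumber here.
        Console.WriteLine($"Uploading part {_partNumber} ({_buffer.Length} bytes)");
        _buffer.SetLength(0);
    }

    protected override void Dispose(bool disposing)
    {
        // The final (possibly short) part is sent when the stream is closed/disposed.
        if (disposing && _buffer.Length > 0)
            FlushPart();
        base.Dispose(disposing);
    }

    public override bool CanRead => false;
    public override bool CanSeek => false;
    public override bool CanWrite => true;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }
    public override void Flush() { }
    public override int Read(byte[] buffer, int offset, int count) => throw new NotSupportedException();
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
}
```

This pattern is what makes uploads of unknown size possible: no part needs to know the total object size, only its own length.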

What other operations does the client support?

The primary rationale for this client is to be able to upload objects of unknown size in a simple way, so support for other operations is rather limited. The only supported operations are

  • Upload an object (OpenObjectForWritingAsync())
  • Download an object (OpenObjectForReadingAsync())
  • Get an object's metadata (GetObjectMetadataAsync())
  • Delete an object (DeleteObjectAsync())

Specifically, bucket operations (for example, creating a bucket) are not supported by this client.
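Assuming the same kind of client instance as in the upload example, the other operations might be used roughly like this. Only the method names are taken from the list above; the exact signatures and return types (for example, what GetObjectMetadataAsync() returns) are assumptions, so check them against the library's API:

```csharp
using System;
using System.IO;
using DoIt.S3Client;

using var client = new S3Client(new Uri("https://your-bucket-endpoint"), "your-region", "access-key", "secret-key");

// Download an object by opening it for reading.
using (var stream = await client.OpenObjectForReadingAsync("test.zip"))
using (var reader = new StreamReader(stream))
{
    // Consume the stream however you like; here we just read it as text.
    var content = await reader.ReadToEndAsync();
}

// Get an object's metadata (the shape of the returned value is an assumption).
var metadata = await client.GetObjectMetadataAsync("test.zip");

// Delete an object.
await client.DeleteObjectAsync("test.zip");
```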

Target frameworks

The package includes a build for net7.0, which is the compatible target framework. NuGet additionally computes compatibility for the net7.0 platform-specific frameworks (android, ios, maccatalyst, macos, tvos, windows), for net8.0, and for the net8.0 platform-specific frameworks (android, browser, ios, maccatalyst, macos, tvos, windows).


Version Downloads Last updated
1.1.0 251 8/17/2023
1.0.1 183 6/13/2023
1.0.0 132 6/9/2023