Azure.Storage.Files.DataLake 12.21.0 (Prefix Reserved)

.NET CLI: dotnet add package Azure.Storage.Files.DataLake --version 12.21.0
Package Manager: NuGet\Install-Package Azure.Storage.Files.DataLake -Version 12.21.0
PackageReference: <PackageReference Include="Azure.Storage.Files.DataLake" Version="12.21.0" />
Paket CLI: paket add Azure.Storage.Files.DataLake --version 12.21.0
Script & Interactive: #r "nuget: Azure.Storage.Files.DataLake, 12.21.0"
Cake Addin: #addin nuget:?package=Azure.Storage.Files.DataLake&version=12.21.0
Cake Tool: #tool nuget:?package=Azure.Storage.Files.DataLake&version=12.21.0
Azure Storage Files Data Lake client library for .NET
Server Version: 2021-02-12, 2020-12-06, 2020-10-02, 2020-08-04, 2020-06-12, 2020-04-08, 2020-02-10, 2019-12-12, 2019-07-07, and 2019-02-02
Azure Data Lake includes all the capabilities required to make it easy for developers, data scientists, and analysts to store data of any size, shape, and speed, and do all types of processing and analytics across platforms and languages. It removes the complexities of ingesting and storing all of your data while making it faster to get up and running with batch, streaming, and interactive analytics.
Source code | Package (NuGet) | API reference documentation | REST API documentation | Product documentation
Getting started
Install the package
Install the Azure Storage Files Data Lake client library for .NET with NuGet:
dotnet add package Azure.Storage.Files.DataLake
Prerequisites
You need an Azure subscription and a Storage Account to use this package.
To create a new Storage Account, you can use the Azure Portal, Azure PowerShell, or the Azure CLI. Because this client library only supports accounts with a hierarchical namespace (HNS), enable it when creating the account. Here's an example using the Azure CLI:
az storage account create --name MyStorageAccount --resource-group MyResourceGroup --location westus --sku Standard_LRS --enable-hierarchical-namespace true
Key concepts
DataLake Storage Gen2 was designed to:
- Service multiple petabytes of information while sustaining hundreds of gigabits of throughput
- Allow you to easily manage massive amounts of data
Key Features of DataLake Storage Gen2 include:
- Hadoop compatible access
- A superset of POSIX permissions
- Cost effective in terms of low-cost storage capacity and transactions
- Optimized driver for big data analytics
A fundamental part of Data Lake Storage Gen2 is the addition of a hierarchical namespace to Blob storage. The hierarchical namespace organizes objects/files into a hierarchy of directories for efficient data access.
In the past, cloud-based analytics had to compromise in areas of performance, management, and security. Data Lake Storage Gen2 addresses each of these aspects in the following ways:
- Performance is optimized because you do not need to copy or transform data as a prerequisite for analysis. The hierarchical namespace greatly improves the performance of directory management operations, which improves overall job performance.
- Management is easier because you can organize and manipulate files through directories and subdirectories.
- Security is enforceable because you can define POSIX permissions on directories or individual files.
- Cost effectiveness is made possible as Data Lake Storage Gen2 is built on top of the low-cost Azure Blob storage. The additional features further lower the total cost of ownership for running big data analytics on Azure.
Data Lake Storage Gen2 offers two types of resources:
- The filesystem used via 'DataLakeFileSystemClient'
- The path used via 'DataLakeFileClient' or 'DataLakeDirectoryClient'
ADLS Gen2 | Blob |
---|---|
Filesystem | Container |
Path (File or Directory) | Blob |
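In code, this mapping corresponds to the client hierarchy. A minimal sketch, assuming an existing DataLakeServiceClient named serviceClient and that the filesystem and paths already exist:
// Filesystem-level client (maps to a blob container)
DataLakeFileSystemClient filesystemClient = serviceClient.GetFileSystemClient("sample-filesystem");
// Path-level clients (map to blobs): a directory and a file within it
DataLakeDirectoryClient directoryClient = filesystemClient.GetDirectoryClient("sample-directory");
DataLakeFileClient fileClient = filesystemClient.GetFileClient("sample-directory/sample-file");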
Note: This client library does not support hierarchical namespace (HNS) disabled storage accounts.
Thread safety
We guarantee that all client instance methods are thread-safe and independent of each other (guideline). This ensures that the recommendation of reusing client instances is always safe, even across threads.
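For example, a single client instance can be shared across concurrent operations. A minimal sketch, assuming serviceUri and sharedKeyCredential are already defined and the code runs inside an async method:
// One filesystem client, reused safely from multiple concurrent tasks
DataLakeFileSystemClient filesystem = new DataLakeServiceClient(serviceUri, sharedKeyCredential)
    .GetFileSystemClient("sample-filesystem");
await Task.WhenAll(
    filesystem.GetFileClient("sample-file-1").CreateAsync(),
    filesystem.GetFileClient("sample-file-2").CreateAsync());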
Additional concepts
Client options | Accessing the response | Long-running operations | Handling failures | Diagnostics | Mocking | Client lifetime
Examples
Create a DataLakeServiceClient
StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey);
// Create DataLakeServiceClient using StorageSharedKeyCredentials
DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential);
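The client can also be constructed with a token credential instead of a shared key. A minimal sketch, assuming the Azure.Identity package is referenced and DefaultAzureCredential is used (any TokenCredential works); replace the placeholder account name:
// Create DataLakeServiceClient using Azure Active Directory authentication (Azure.Identity)
DataLakeServiceClient aadServiceClient = new DataLakeServiceClient(
    new Uri("https://<storage-account-name>.dfs.core.windows.net"),
    new DefaultAzureCredential());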
Create a DataLakeFileSystemClient
StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey);
// Create DataLakeServiceClient using StorageSharedKeyCredentials
DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential);
// Create a DataLake Filesystem
DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("sample-filesystem");
filesystem.Create();
Create a DataLakeDirectoryClient
StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey);
// Create DataLakeServiceClient using StorageSharedKeyCredentials
DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential);
// Get a reference to a filesystem named "sample-filesystem-append" and then create it
DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("sample-filesystem-append");
filesystem.Create();
// Create a DataLake Directory
DataLakeDirectoryClient directory = filesystem.GetDirectoryClient("sample-directory");
directory.Create();
Create a DataLakeFileClient
Create DataLakeFileClient from a DataLakeDirectoryClient
// Create a DataLake Directory
DataLakeDirectoryClient directory = filesystem.GetDirectoryClient("sample-directory");
directory.Create();
// Create a DataLake File using a DataLake Directory
DataLakeFileClient file = directory.GetFileClient("sample-file");
file.Create();
Create DataLakeFileClient from a DataLakeFileSystemClient
// Create a DataLake Filesystem
DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("sample-filesystem");
filesystem.Create();
// Create a DataLake file using a DataLake Filesystem
DataLakeFileClient file = filesystem.GetFileClient("sample-file");
file.Create();
Appending Data to a DataLake File
// Create a file
DataLakeFileClient file = filesystem.GetFileClient("sample-file");
file.Create();
// Append data to the DataLake File
file.Append(File.OpenRead(sampleFilePath), 0);
file.Flush(SampleFileContent.Length);
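Append only stages data at the given offset; Flush commits everything staged up to the given position. A minimal sketch of appending two local files in sequence (the local paths are assumptions):
using (Stream first = File.OpenRead(firstLocalFilePath))   // assumed local path
using (Stream second = File.OpenRead(secondLocalFilePath)) // assumed local path
{
    // Stage the first block at offset 0 and the second immediately after it
    file.Append(first, offset: 0);
    file.Append(second, offset: first.Length);
    // Commit all staged data; position is the total file length after the flush
    file.Flush(position: first.Length + second.Length);
}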
Reading Data from a DataLake File
Response<FileDownloadInfo> fileContents = file.Read();
Reading Streaming Data from a DataLake File
Response<DataLakeFileReadStreamingResult> fileContents = file.ReadStreaming();
Stream readStream = fileContents.Value.Content;
Reading Content Data from a DataLake File
Response<DataLakeFileReadResult> fileContents = file.ReadContent();
BinaryData readData = fileContents.Value.Content;
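For example, the streaming result can be copied directly into a local file (the destination path below is an assumption):
// Copy the downloaded content to a local file
Response<DataLakeFileReadStreamingResult> download = file.ReadStreaming();
using (FileStream destination = File.OpenWrite(localFilePath)) // assumed local path
{
    download.Value.Content.CopyTo(destination);
}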
Listing/Traversing through a DataLake Filesystem
// Collect the names of all paths in the filesystem
List<string> names = new List<string>();
foreach (PathItem pathItem in filesystem.GetPaths())
{
    names.Add(pathItem.Name);
}
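GetPaths can also be scoped to a directory and made recursive. A minimal sketch, assuming a directory named "sample-directory" exists:
// Recursively list everything under a specific directory
foreach (PathItem pathItem in filesystem.GetPaths(path: "sample-directory", recursive: true))
{
    Console.WriteLine($"{pathItem.Name} (directory: {pathItem.IsDirectory})");
}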
Set Permissions on a DataLake File
// Create a DataLake file so we can set permissions on it
DataLakeFileClient fileClient = filesystem.GetFileClient("sample-file");
fileClient.Create();
// Set the Permissions of the file
PathPermissions pathPermissions = PathPermissions.ParseSymbolicPermissions("rwxrwxrwx");
fileClient.SetPermissions(permissions: pathPermissions);
Set Access Controls (ACLs) on a DataLake File
// Create a DataLake file so we can set the Access Controls on the files
DataLakeFileClient fileClient = filesystem.GetFileClient("sample-file");
fileClient.Create();
// Set Access Control List
IList<PathAccessControlItem> accessControlList
= PathAccessControlExtensions.ParseAccessControlList("user::rwx,group::r--,mask::rwx,other::---");
fileClient.SetAccessControlList(accessControlList);
Get Access Controls (ACLs) on a DataLake File
// Get Access Control List
PathAccessControl accessControlResponse = fileClient.GetAccessControl();
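The returned PathAccessControl exposes the owner, owning group, permissions, and the individual ACL entries; for example:
// Inspect the owner, group, and ACL entries
Console.WriteLine($"Owner: {accessControlResponse.Owner}, Group: {accessControlResponse.Group}");
foreach (PathAccessControlItem entry in accessControlResponse.AccessControlList)
{
    Console.WriteLine($"{entry.AccessControlType}: {entry.Permissions}");
}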
Rename a DataLake File
DataLakeFileClient renamedFileClient = fileClient.Rename("sample-file2");
Rename a DataLake Directory
DataLakeDirectoryClient renamedDirectoryClient = directoryClient.Rename("sample-directory2");
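Rename also moves a file or directory when the destination path includes parent directories, and it accepts an optional destinationFileSystem parameter to move across filesystems. A minimal sketch, assuming the destination directory and filesystem already exist:
// Move the file into a directory in a different filesystem
DataLakeFileClient movedFileClient = fileClient.Rename(
    destinationPath: "sample-directory/sample-file2",
    destinationFileSystem: "other-filesystem");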
Get Properties on a DataLake File
// Get Properties on a File
PathProperties filePathProperties = fileClient.GetProperties();
Get Properties on a DataLake Directory
// Get Properties on a Directory
PathProperties directoryPathProperties = directoryClient.GetProperties();
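The returned PathProperties carries values such as the content length, last-modified time, and user metadata; for example, using the file properties from above:
// Read individual property values
Console.WriteLine($"Length: {filePathProperties.ContentLength}");
Console.WriteLine($"Last modified: {filePathProperties.LastModified}");
foreach (KeyValuePair<string, string> metadata in filePathProperties.Metadata)
{
    Console.WriteLine($"{metadata.Key}: {metadata.Value}");
}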
Troubleshooting
All Azure Storage Files DataLake service operations will throw a RequestFailedException on failure with helpful ErrorCodes. Many of these errors are recoverable.
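For example, a request against a missing path can be handled by catching RequestFailedException (from the Azure namespace) and inspecting its status and error code; the "PathNotFound" code mentioned below is illustrative:
DataLakeFileClient missingFile = filesystem.GetFileClient("does-not-exist");
try
{
    missingFile.GetProperties();
}
catch (RequestFailedException ex) when (ex.Status == 404)
{
    // ex.ErrorCode carries the service error code, e.g. "PathNotFound"
    Console.WriteLine($"Path not found: {ex.ErrorCode}");
}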
Next steps
Get started with our DataLake samples:
- Hello World: Append, Read, and List DataLake Files (or asynchronously)
- Auth: Authenticate with public access, shared keys, shared access signatures, and Azure Active Directory.
Contributing
See the Storage CONTRIBUTING.md for details on building, testing, and contributing to this library.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit cla.microsoft.com.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 is compatible. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies

.NETStandard 2.0
- Azure.Storage.Blobs (>= 12.23.0)
- Azure.Storage.Common (>= 12.22.0)
- System.Text.Json (>= 6.0.10)

net6.0
- Azure.Storage.Blobs (>= 12.23.0)
- Azure.Storage.Common (>= 12.22.0)
- System.Text.Json (>= 6.0.10)
NuGet packages (25)
Showing the top 5 NuGet packages that depend on Azure.Storage.Files.DataLake:
Package | Description |
---|---|
Service.Extensions.Azure.DataLake | Extensions to provide consistent configurations and patterns for your service. |
Microsoft.DataPrep | Microsoft Azure Machine Learning Data Preparation SDK. |
Relativity.Transfer.SDK | Relativity Transfer SDK allows performing high-throughput transfers of files from and to Relativity environment. |
Auditflo.Utility | Package Description |
Traffk.StorageProviders.Providers.AzureBlobStorageProvider | Provider implementation for BLOB storage on classic + Azure Data Lake Storage Gen2 stores. NOT for ADLSG1. |
GitHub repositories (1)
Showing the top 1 popular GitHub repositories that depend on Azure.Storage.Files.DataLake:
Repository | Stars |
---|---|
Azure/azure-powershell (Microsoft Azure PowerShell) | |
Version | Downloads | Last updated |
---|---|---|
12.21.0 | 54,838 | 11/12/2024 |
12.21.0-beta.2 | 421 | 10/10/2024 |
12.21.0-beta.1 | 101 | 10/8/2024 |
12.20.1 | 107,407 | 10/11/2024 |
12.20.0 | 71,171 | 9/19/2024 |
12.20.0-beta.1 | 1,846 | 8/7/2024 |
12.19.1 | 215,400 | 7/25/2024 |
12.19.0 | 32,656 | 7/16/2024 |
12.19.0-beta.1 | 712 | 6/11/2024 |
12.18.0 | 398,444 | 5/14/2024 |
12.18.0-beta.2 | 706 | 4/16/2024 |
12.18.0-beta.1 | 10,202 | 12/5/2023 |
12.17.1 | 921,238 | 11/14/2023 |
12.17.0 | 18,207 | 11/6/2023 |
12.17.0-beta.1 | 1,068 | 10/16/2023 |
12.16.0 | 257,183 | 9/12/2023 |
12.16.0-beta.1 | 12,213 | 8/8/2023 |
12.15.0 | 236,078 | 7/11/2023 |
12.15.0-beta.1 | 894 | 5/30/2023 |
12.14.0 | 398,299 | 4/11/2023 |
12.14.0-beta.1 | 418 | 3/28/2023 |
12.13.1 | 75,826 | 3/24/2023 |
12.13.0 | 170,564 | 2/22/2023 |
12.13.0-beta.1 | 1,093 | 2/8/2023 |
12.12.1 | 741,691 | 10/13/2022 |
12.12.0 | 30,329 | 10/12/2022 |
12.12.0-beta.1 | 2,415 | 8/23/2022 |
12.11.0 | 868,892 | 7/8/2022 |
12.11.0-beta.1 | 487 | 6/15/2022 |
12.10.0 | 312,132 | 5/2/2022 |
12.10.0-beta.1 | 639 | 4/12/2022 |
12.9.0 | 392,346 | 3/10/2022 |
12.9.0-beta.3 | 732 | 2/7/2022 |
12.9.0-beta.2 | 3,720 | 11/30/2021 |
12.9.0-beta.1 | 1,596 | 11/4/2021 |
12.8.0 | 773,524 | 9/9/2021 |
12.8.0-beta.2 | 1,725 | 7/23/2021 |
12.8.0-beta.1 | 243 | 7/23/2021 |
12.7.0 | 329,441 | 6/9/2021 |
12.7.0-beta.4 | 490 | 5/12/2021 |
12.7.0-beta.3 | 1,368 | 4/9/2021 |
12.7.0-beta.2 | 517 | 3/10/2021 |
12.7.0-beta.1 | 5,826 | 2/10/2021 |
12.6.2 | 52,012 | 5/21/2021 |
12.6.1 | 195,703 | 3/29/2021 |
12.6.0 | 363,447 | 1/12/2021 |
12.6.0-beta.1 | 999 | 12/7/2020 |
12.5.0 | 261,517 | 11/10/2020 |
12.5.0-preview.1 | 20,022 | 10/1/2020 |
12.4.0 | 975,505 | 8/31/2020 |
12.3.1 | 55,414 | 8/18/2020 |
12.3.0-preview.2 | 15,464 | 7/28/2020 |
12.3.0-preview.1 | 6,398 | 7/3/2020 |
12.2.2 | 331,132 | 6/5/2020 |
12.2.1 | 64,345 | 6/2/2020 |
12.2.0 | 41,698 | 5/6/2020 |
12.1.0 | 80,468 | 4/6/2020 |
12.0.0 | 84,215 | 3/12/2020 |
12.0.0-preview.9 | 12,142 | 2/11/2020 |
12.0.0-preview.8 | 26,728 | 1/10/2020 |
12.0.0-preview.7 | 4,920 | 12/4/2019 |
12.0.0-preview.6 | 5,298 | 11/7/2019 |
12.0.0-preview.5 | 1,649 | 11/6/2019 |