Windows: how do I upload some file into Azure blob storage without writing my own program?
Declaration: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must likewise follow the CC BY-SA license, link to the original question, and attribute it to the original authors (not me): StackOverflow.
Original question: http://stackoverflow.com/questions/6584036/
How do I upload some file into Azure blob storage without writing my own program?
Asked by sharptooth
I created an Azure Storage account. I have a 400 megabytes .zip file that I want to put into blob storage for later use.
How can I do that without writing code? Is there some interface for that?
Answered by Stephen Chung
Free tools:
- Visual Studio 2010 -- install Azure tools and you can find the blobs in the Server Explorer
- CloudBerry Lab's CloudBerry Explorer for Azure Blob Storage
- ClumsyLeaf CloudXplorer
- Azure Storage Explorer from CodePlex (try version 4 beta)
There was an old program called Azure Blob Explorer or something that no longer works with the new Azure SDK.
Out of these, I personally like CloudBerry Explorer the best.
Answered by Yao
The easiest way is to use Azure Storage PowerShell. It provided many commands to manage your storage container/blob/table/queue.
For your mentioned case, you could use Set-AzureStorageBlobContent, which can upload a local file into Azure storage as a block blob or page blob.
Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname
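Note that Set-AzureStorageBlobContent also needs to know which storage account to talk to. Assuming the Azure PowerShell module is installed, a minimal sketch would first create a storage context (the account name, key, container, and file names below are placeholders, not from the original answer):
# create a context for your storage account (name and key are placeholders)
$ctx = New-AzureStorageContext -StorageAccountName "youraccount" -StorageAccountKey "<your access key>"
# upload the local file as a block blob into the given container
Set-AzureStorageBlobContent -Container "containername" -File ".\file.zip" -Blob "file.zip" -Context $ctx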
For details, please refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx.
Answered by Gaurav Mantri-AIS
If you're looking for a tool to do so, may I suggest that you take a look at our tool Cloud Storage Studio (http://www.cerebrata.com/Products/CloudStorageStudio). It's a commercial tool for managing Windows Azure Storage and Hosted Service. You can also find a comprehensive list of Windows Azure Storage Management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx
Hope this helps.
Answered by dunnry
The StorageClient has this built into it. You hardly need to write anything:
var account = new CloudStorageAccount(creds, false);
var client = account.CreateCloudBlobClient();
var blob = client.GetBlobReference("/somecontainer/hugefile.zip");
//1MB seems to be a pretty good all purpose size
client.WriteBlockSizeInBytes = 1024 * 1024;
//this sets # of parallel uploads for blocks
client.ParallelOperationThreadCount = 4; //normally set to one per CPU core
//this will break blobs up automatically after this size
client.SingleBlobUploadThresholdInBytes = 4096;
blob.UploadFile("somehugefile.zip");
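The creds variable is not defined in the snippet. Assuming the old Microsoft.WindowsAzure.StorageClient library (the one that exposes WriteBlockSizeInBytes and UploadFile), it would typically be built from your account name and key, for example (both values below are placeholders):
// build credentials for the storage account; name and key are placeholders
var creds = new StorageCredentialsAccountAndKey("youraccountname", "youraccountkey");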
Answered by ntheitroad
I use Cyberduck to manage my blob storage.
It is free and very easy to use. It works with other cloud storage solutions as well.
I recently found this one as well: CloudXplorer
Hope it helps.
Answered by Ivan Ignatiev
There is a new open-source tool provided by Microsoft:
- Project Deco - cross-platform Microsoft Azure Storage Account Explorer.
Please check these links:
- Download binaries: http://storageexplorer.com/
- Source Code: https://github.com/Azure/deco
Answered by David Yee
A simple batch file using Microsoft's AzCopy utility will do the trick. You can drag-and-drop your files onto the following batch file to upload them into your blob storage container:
upload.bat
@ECHO OFF
SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
SET BLOB_KEY=<<<your access key>>>
:AGAIN
IF "%~1" == "" GOTO DONE
AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob
SHIFT
GOTO AGAIN
:DONE
PAUSE
Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) instead of uploading an entire directory.
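If you do want to copy a whole directory, classic AzCopy can also copy recursively; a hedged variant of the command above (drop the /Pattern flag and add the /S switch; the folder path is a placeholder) would be:
AzCopy /Source:"C:\folder-to-upload" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /S /destType:blob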
Answered by ezolotko
You can use Cloud Combine for reliable and quick file uploads to Azure blob storage.
Answered by ezolotko
You can upload large files directly to Azure Blob Storage using the HTTP PUT verb; the biggest file I have tried with the code below is 4.6 GB. You can do this in C# like this:
// write up to ChunkSize of data to the web request
void WriteToStreamCallback(IAsyncResult asynchronousResult)
{
    var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
    var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
    var buffer = new Byte[4096];
    int bytesRead;
    var tempTotal = 0;
    File.FileStream.Position = DataSent;
    while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
           && tempTotal + bytesRead < CHUNK_SIZE
           && !File.IsDeleted
           && File.State != Constants.FileStates.Error)
    {
        requestStream.Write(buffer, 0, bytesRead);
        requestStream.Flush();
        DataSent += bytesRead;
        tempTotal += bytesRead;
        File.UiDispatcher.BeginInvoke(OnProgressChanged);
    }
    requestStream.Close();
    if (!AbortRequested) webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}

void StartUpload()
{
    var uriBuilder = new UriBuilder(UploadUrl);
    if (UseBlocks)
    {
        // encode the block name and add it to the query string
        CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
        uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
    }
    // with or without using blocks, we'll make a PUT request with the data
    var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
    webRequest.Method = "PUT";
    webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
}
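One detail worth calling out: when blocks are used (the comp=block requests above), the blob only becomes readable once the block IDs are committed with a final Put Block List request. The original answer does not show that step, so here is a rough sketch under the assumption that you collect the block IDs as you upload; the names CommitBlockList and blockIds are illustrative, and plain HttpWebRequest is used instead of the Silverlight WebRequestCreator above (requires System.Net, System.Text, and System.Collections.Generic):
// Illustrative sketch: commit the previously uploaded blocks so the blob becomes visible.
// uploadUrl is the same SAS URL used above; blockIds are the base64 ids sent with comp=block.
void CommitBlockList(string uploadUrl, IEnumerable<string> blockIds)
{
    var body = new StringBuilder("<?xml version=\"1.0\" encoding=\"utf-8\"?><BlockList>");
    foreach (var id in blockIds)
        body.AppendFormat("<Latest>{0}</Latest>", id);
    body.Append("</BlockList>");

    var bytes = Encoding.UTF8.GetBytes(body.ToString());
    var request = (HttpWebRequest)WebRequest.Create(uploadUrl + "&comp=blocklist");
    request.Method = "PUT";
    request.ContentLength = bytes.Length;
    using (var stream = request.GetRequestStream())
        stream.Write(bytes, 0, bytes.Length);

    // a 201 Created response means the block list was committed successfully
    using (var response = (HttpWebResponse)request.GetResponse()) { }
}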
The UploadUrl is generated by Azure itself and contains a Shared Access Signature; this SAS URL says where the blob is to be uploaded and for how long the security access (write access in your case) is granted. You can generate a SAS URL like this:
readonly CloudBlobClient BlobClient;
readonly CloudBlobContainer BlobContainer;

public UploadService()
{
    // Setup the connection to Windows Azure Storage
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    BlobClient = storageAccount.CreateCloudBlobClient();
    // Get and create the container
    BlobContainer = BlobClient.GetContainerReference("publicfiles");
}

string JsonSerializeData(string url)
{
    var serializer = new DataContractJsonSerializer(url.GetType());
    var memoryStream = new MemoryStream();
    serializer.WriteObject(memoryStream, url);
    return Encoding.Default.GetString(memoryStream.ToArray());
}

public string GetUploadUrl()
{
    var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
    {
        Permissions = SharedAccessPermissions.Write,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(60)
    });
    return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
}
I also have a thread on the subject where you can find more information: How to upload huge files to the Azure blob from a web page.
Answered by Sarat Chandra
You can upload files to an Azure Storage account blob using the Command Prompt.
Install Microsoft Azure Storage tools.
And then upload it to your account's blob with this CLI command:
AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob
Hope it helps. :)