My JSON blob file is about 212 MB. When I debug locally, it takes roughly 15 minutes to download. When I deploy the code to an Azure App Service, it runs for 10 minutes and then fails with the error below (locally it also fails intermittently with the same error):
Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including the signature.
Code attempt 1:
// Create a SAS token for reading the blob, valid for 15 minutes
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
    Permissions = SharedAccessBlobPermissions.Read
};
var blob = cloudBlobContainer.GetBlockBlobReference(blobFilePath);
string sasContainerToken = blob.GetSharedAccessSignature(sasConstraints);
var cloudBlockBlob = new CloudBlockBlob(new Uri(blob.Uri + sasContainerToken));

using (var stream = new MemoryStream())
{
    // download the whole blob into memory
    await cloudBlockBlob.DownloadToStreamAsync(stream);
    // reset the stream's position to 0 before reading
    stream.Position = 0;
    var serializer = new JsonSerializer();
    using (var sr = new StreamReader(stream))
    using (var jsonTextReader = new JsonTextReader(sr))
    {
        jsonTextReader.SupportMultipleContent = true;
        result = new List<T>();
        while (jsonTextReader.Read())
        {
            result.Add(serializer.Deserialize<T>(jsonTextReader));
        }
    }
}
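For reference, I believe the same parsing could also read straight from the blob's stream instead of buffering the whole 212 MB into a MemoryStream first; a rough, untested sketch of that variant (same storage SDK and Newtonsoft.Json types as above):

// Rough sketch (untested): stream the blob directly into the JSON reader
// instead of holding the whole blob in a MemoryStream first.
using (var blobStream = await cloudBlockBlob.OpenReadAsync())
using (var sr = new StreamReader(blobStream))
using (var jsonTextReader = new JsonTextReader(sr))
{
    jsonTextReader.SupportMultipleContent = true;
    var serializer = new JsonSerializer();
    result = new List<T>();
    while (jsonTextReader.Read())
    {
        result.Add(serializer.Deserialize<T>(jsonTextReader));
    }
}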
Code attempt 2: I tried using DownloadRangeToStreamAsync to download the blob in chunks, but nothing changed:
int bufferLength = 1 * 1024 * 1024; // 1 MB chunk
long blobRemainingLength = blob.Properties.Length;
long offset = 0;
do
{
    long chunkLength = Math.Min(bufferLength, blobRemainingLength);
    using (var ms = new MemoryStream())
    {
        // download the current range, then copy it into the shared output stream
        await blob.DownloadRangeToStreamAsync(ms, offset, chunkLength);
        ms.Position = 0;
        lock (outPutStream)
        {
            outPutStream.Position = offset;
            var bytes = ms.ToArray();
            outPutStream.Write(bytes, 0, bytes.Length);
        }
    }
    // advance to the next chunk only after the current range has been written
    offset += chunkLength;
    blobRemainingLength -= chunkLength;
}
while (blobRemainingLength > 0);
I don't think 212 MB is a particularly large JSON file. Can you suggest a solution?
I suggest you try the Azure Storage Data Movement Library.
I tested with a slightly larger file of 220 MB, and downloading it into memory took about 5 minutes.
Sample code:
// requires the Azure Storage Data Movement Library (NuGet package: Microsoft.Azure.Storage.DataMovement)
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
    Permissions = SharedAccessBlobPermissions.Read
};
CloudBlockBlob blob = blobContainer.GetBlockBlobReference("t100.txt");
string sasContainerToken = blob.GetSharedAccessSignature(sasConstraints);
var cloudBlockBlob = new CloudBlockBlob(new Uri(blob.Uri + sasContainerToken));
var stream = new MemoryStream();

// set this value as per your need
TransferManager.Configurations.ParallelOperations = 5;

Console.WriteLine("begin to download...");

// use a Stopwatch to measure the download time
Stopwatch stopwatch = new Stopwatch();
stopwatch.Start();

DownloadOptions options = new DownloadOptions();
options.DisableContentMD5Validation = true;

// these lines only report download progress; you can remove them in your code
SingleTransferContext context = new SingleTransferContext();
context.ProgressHandler = new Progress<TransferStatus>((progress) =>
{
    Console.WriteLine("Bytes downloaded: {0}", progress.BytesTransferred);
});

var task = TransferManager.DownloadAsync(cloudBlockBlob, stream, options, context);
task.Wait();

stopwatch.Stop();
Console.WriteLine("the length of the stream is: " + stream.Length);
Console.WriteLine("the time taken is: " + stopwatch.ElapsedMilliseconds + " ms");
Test result:
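If you also need the List&lt;T&gt; from your question once the download completes, you can point the same Newtonsoft.Json reader pattern from your first attempt at the downloaded stream; a minimal sketch (T is your item type):

// Minimal sketch: reuse the question's Newtonsoft.Json pattern on the downloaded stream.
stream.Position = 0;
var serializer = new JsonSerializer();
var result = new List<T>();
using (var sr = new StreamReader(stream))
using (var jsonTextReader = new JsonTextReader(sr))
{
    jsonTextReader.SupportMultipleContent = true;
    while (jsonTextReader.Read())
    {
        result.Add(serializer.Deserialize<T>(jsonTextReader));
    }
}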