C# migrating files from MongoDB to Azure Blob storage with "UploadFromStream" is not working

Problem description

I am running into a problem migrating files from MongoDB to Azure Blob storage.

The following method takes a GridFSFile object (which represents a file in the MongoDB GridFSFileStorage) and then calls the uploadMemoryStream method to perform the upload.

It is worth mentioning that gridFSFile does have content after FindById, that the stream it yields does have content as well, and that its position is initially 0.

The gridFSFile.Open method creates a Stream object, which is then passed as an argument to the upload.

private static void iterateOverVersionCollection(Version version, Asset asset)
{
    try
    {    
        string _gridFSId = version.GridFSId;
        GridFSFile gridFSFile = gridFSFileStorage.FindById(_gridFSId);
        if (gridFSFile == null) return;

        string size = version.Name.ToLower();
        asset.Size = size;
        CloudBlockBlob blockBlob = GetBlockBlobReference(version, gridFSFile, asset);
        uploadMemoryStream(blockBlob, gridFSFile, asset);
        asset.UploadedOK = true;
    }
    catch (StorageException ex)
    {
        asset.UploadedOK = false;
        logException(ex, asset);
    }
}

private static void uploadMemoryStream(CloudBlockBlob blockBlob, GridFSFile gridFSFile, Asset asset)
{
    // Dispose the GridFS stream once the upload is done
    using (Stream st = gridFSFile.Open())
    {
        blockBlob.UploadFromStream(st);
    }
}

UploadFromStream takes forever and never uploads. One thing worth mentioning is that no matter what I do with gridFSFile, if I try to create a MemoryStream from it with the C# Stream.CopyTo method, that also takes forever and never ends, so the application stays blocked at blockBlob.UploadFromStream(st); (the pattern is sketched below).
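For reference, the CopyTo pattern described above would look roughly like this (a sketch only; gridFSFile and blockBlob come from the code above, and the explicit rewinds are my assumption about the intended usage):

using (Stream st = gridFSFile.Open())
using (var ms = new MemoryStream())
{
    st.Position = 0;   // position is already 0, per the observation above
    st.CopyTo(ms);     // this copy also hangs and never completes
    ms.Position = 0;   // rewind the buffer before handing it to the SDK
    blockBlob.UploadFromStream(ms);
}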

Instead of passing gridFSFile.Open to UploadFromStream, I also tried the following code:

using (var stream = new MemoryStream())
{
    byte[] buffer = new byte[2048]; // read in chunks of 2 KB
    int bytesRead;
    // st is the stream returned by gridFSFile.Open(), as above
    while ((bytesRead = st.Read(buffer, 0, buffer.Length)) > 0)
    {
        stream.Write(buffer, 0, bytesRead);
    }
    byte[] result = stream.ToArray();
}

But likewise, the program gets stuck at the st.Read line.

Any help is greatly appreciated.

c# mongodb azure azure-storage-blobs azure-blob-storage
1 Answer

Please note that since UploadFromFileAsync() and UploadFromStream are not reliable and efficient operations for large blobs, I'd suggest you consider the following alternatives:

If a command-line tool is acceptable to you, you can try AzCopy, which transfers Azure Storage data at high performance and whose transfers can be paused and resumed (an illustrative invocation is shown below).
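For illustration, a typical AzCopy invocation (v10 syntax; the account, container, and SAS token are placeholders) looks like this:

azcopy copy "C:\local\data" "https://myaccount.blob.core.windows.net/mycontainer?<SAS-token>" --recursive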

If you want to control the transfer jobs programmatically, use the Azure Storage Data Movement Library, which is the core of AzCopy. Sample code:

string storageConnectionString = "myStorageConnectionString";
CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("mycontainer");
blobContainer.CreateIfNotExistsAsync().Wait();
string sourcePath = @"C:\Tom\TestLargeFile.zip";
CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("LargeFile.zip");
// Set up the number of concurrent operations
TransferManager.Configurations.ParallelOperations = 64;
// Set up the transfer context and track the upload progress
var context = new SingleTransferContext
{
    ProgressHandler =
        new Progress<TransferStatus>(
            progress => { Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred); })
};
// Upload a local blob
TransferManager.UploadAsync(sourcePath, destBlob, null, context, CancellationToken.None).Wait();
Console.WriteLine("Upload finished!");
Console.ReadKey();
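Since the question's source is a GridFS stream rather than a local file, note that TransferManager also has a stream-based overload; a minimal sketch, assuming gridFSFile.Open() yields a readable stream as in the question and blobContainer is the container from the sample above:

using (Stream source = gridFSFile.Open())
{
    CloudBlockBlob streamBlob = blobContainer.GetBlockBlobReference("LargeFileFromStream.bin");
    // Upload directly from the stream; null options/context use the defaults
    TransferManager.UploadAsync(source, streamBlob, null, null, CancellationToken.None).Wait();
}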

If you are still looking to upload the file from a stream programmatically, I'd suggest the code below, which uploads the file in blocks:

var container = _client.GetContainerReference("test");
container.CreateIfNotExists();
var blob = container.GetBlockBlobReference(file.FileName);
// Use an ordered list of (blockId, data) pairs so the block order is preserved;
// PutBlockList commits the blocks in exactly the order given.
var blockDataList = new List<KeyValuePair<string, byte[]>>();
using (var stream = file.InputStream)
{
    var blockSizeInKB = 1024;
    var offset = 0;
    var index = 0;
    while (offset < stream.Length)
    {
        var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset);
        var blockData = new byte[readLength];
        // Stream.Read may return fewer bytes than requested, so loop until
        // the whole block is filled (or the stream ends)
        var filled = 0;
        while (filled < readLength)
        {
            var bytesRead = stream.Read(blockData, filled, readLength - filled);
            if (bytesRead == 0) break; // end of stream
            filled += bytesRead;
        }
        offset += filled;
        // Block IDs must be Base64 strings of equal decoded length within a blob
        blockDataList.Add(new KeyValuePair<string, byte[]>(
            Convert.ToBase64String(BitConverter.GetBytes(index)), blockData));
        index++;
    }
}

Parallel.ForEach(blockDataList, (bi) =>
{
    blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null);
});
blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray());
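A note on the design: within a blob, all Base64 block IDs must decode to byte sequences of the same length (BitConverter.GetBytes(index) always yields four bytes, which satisfies this), and it is the order of the IDs passed to PutBlockList, not the order in which the PutBlock calls complete, that determines the final content of the blob. That is why the blocks can safely be uploaded with Parallel.ForEach.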

On the other hand, if the file is available on the local file system and you want to use the UploadFromFile method, we also have the flexibility to upload the file data with this approach:

TimeSpan backOffPeriod = TimeSpan.FromSeconds(2);
int retryCount = 1;
BlobRequestOptions bro = new BlobRequestOptions()
{
    SingleBlobUploadThresholdInBytes = 1024 * 1024, // 1 MB, the minimum
    ParallelOperationThreadCount = 1,
    RetryPolicy = new ExponentialRetry(backOffPeriod, retryCount),
};
CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(ConnectionString);
CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
cloudBlobClient.DefaultRequestOptions = bro;
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(ContainerName);
CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference(Path.GetFileName(fileName));
blob.StreamWriteSizeInBytes = 256 * 1024; // 256 KB
blob.UploadFromFile(fileName, FileMode.Open);
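Here SingleBlobUploadThresholdInBytes is the largest blob the client will send as a single PUT request; anything bigger is uploaded as a sequence of blocks of StreamWriteSizeInBytes (256 KB above) and then committed, and the ExponentialRetry policy retries failed requests after the configured back-off.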

For a detailed explanation, please go through

https://www.red-gate.com/simple-talk/cloud/platform-as-a-service/azure-blob-storage-part-4-uploading-large-blobs/

Hope it helps.
