
Parallel archive upload to Amazon Glacier

Asked by a Stack Overflow user on 2013-04-18 21:56:01
2 answers · 699 views · 0 followers · score 1

I have a sample application that uploads a 26 MB file to Amazon Glacier using the AWS SDK for .NET high-level API. The code works fine without threading, but when run on the thread pool it fails at the following line:

Code language: C#
         client.UploadMultipartPart(uploadMPUrequest);

with the error message: The request was aborted: The request was canceled.

Stack trace:

    at Amazon.Runtime.AmazonWebServiceClient.handleHttpWebErrorResponse(AsyncResult asyncResult, WebException we)
    at Amazon.Runtime.AmazonWebServiceClient.InvokeConfiguredRequest(AsyncResult asyncResult)
    at Amazon.Runtime.AmazonWebServiceClient.InvokeHelper(AsyncResult asyncResult)
    at Amazon.Runtime.AmazonWebServiceClient.Invoke(AsyncResult asyncResult)
    at Amazon.Glacier.AmazonGlacierClient.invokeUploadMultipartPart(UploadMultipartPartRequest uploadMultipartPartRequest, AsyncCallback callback, Object state, Boolean synchronized)
    at Amazon.Glacier.AmazonGlacierClient.UploadMultipartPart(UploadMultipartPartRequest uploadMultipartPartRequest)

Note: I am uploading the data in chunks.

My sample code is available at the following link: www.page-monitor.com/Downloads/ArchiveUploadMPU.cs

Is there any sample code for uploading an archive in parallel?
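For reference, here is a minimal sketch of a parallel Glacier multipart upload using the low-level API of the v1-era AWS SDK for .NET (the era this question is from). The fluent `.With*` builders, `SetRange`, and `TreeHashGenerator` usage follow that SDK generation, but the part-size handling and the degree of parallelism are assumptions to verify against the SDK documentation. The key point it illustrates: each part gets its own `FileStream`, because sharing one stream across threads is a race and a plausible cause of "The request was canceled".

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Amazon.Glacier;
using Amazon.Glacier.Model;

// Sketch only: verify names against the SDK version you are using.
internal static class ParallelGlacierUpload
{
    const long PartSize = 4 * 1024 * 1024; // must be a power-of-two multiple of 1 MB

    public static string Upload(AmazonGlacierClient client, string vaultName, string filePath)
    {
        long archiveSize = new FileInfo(filePath).Length;
        int partCount = (int)((archiveSize + PartSize - 1) / PartSize);

        var initRequest = new InitiateMultipartUploadRequest()
            .WithVaultName(vaultName)
            .WithPartSize(PartSize);
        string uploadId = client.InitiateMultipartUpload(initRequest)
                                .InitiateMultipartUploadResult.UploadId;

        var checksums = new string[partCount];

        // One independent FileStream per part: never share a stream across threads.
        Parallel.For(0, partCount, new ParallelOptions { MaxDegreeOfParallelism = 4 }, i =>
        {
            long start = i * PartSize;
            long end = Math.Min(start + PartSize, archiveSize) - 1;

            var buffer = new byte[end - start + 1];
            using (var fs = File.OpenRead(filePath))
            {
                fs.Seek(start, SeekOrigin.Begin);
                int read = 0;
                while (read < buffer.Length)
                    read += fs.Read(buffer, read, buffer.Length - read);
            }

            using (var partStream = new MemoryStream(buffer))
            {
                checksums[i] = TreeHashGenerator.CalculateTreeHash(partStream);
                partStream.Position = 0;

                var partRequest = new UploadMultipartPartRequest()
                    .WithVaultName(vaultName)
                    .WithUploadId(uploadId)
                    .WithChecksum(checksums[i])
                    .WithBody(partStream);
                partRequest.SetRange(start, end);
                client.UploadMultipartPart(partRequest);
            }
        });

        var completeRequest = new CompleteMultipartUploadRequest()
            .WithVaultName(vaultName)
            .WithUploadId(uploadId)
            .WithArchiveSize(archiveSize.ToString())
            .WithChecksum(TreeHashGenerator.CalculateTreeHash(checksums));
        return client.CompleteMultipartUpload(completeRequest)
                     .CompleteMultipartUploadResult.ArchiveId;
    }
}
```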

Thanks and regards, Haseena


2 Answers

Stack Overflow user
Accepted answer
Posted on 2013-10-03 13:29:52

Here is sample code that works well with threads. ChunkDetails is a custom class used to pass the access key ID, bucket name, offset details, and so on. I am also using a ThrottledStream.

Code language: C#
    internal bool UploadUsingHighLevelAPI(String FilePath, ChunkDetails ObjMetaData,
                                          S3Operations.UploadType uploadType,
                                          Stream inputStream)
    {
        String METHOD_NAME = "UploadUsingHighLevelAPI";
        String keyName;
        String existingBucketName;
        TransferUtilityUploadRequest fileTransferUtilityRequest = null;
        int RetryTimes = 3;
        ThrottledStream throttleStreamObj = null;
        long bps = ThrottledStream.Infinite;

        try
        {
            keyName = ObjMetaData.KeyName;
            existingBucketName = ObjMetaData.BucketName;

            TransferUtility fileTransferUtility = new TransferUtility(
                ObjMetaData.AccessKeyID, ObjMetaData.SecretAccessKey);

            FileInfo fin = new FileInfo(FilePath);

            // Throttle the upload to a fraction of the available bandwidth.
            bps = (long)(1024 * ObjMetaData.MaxAvailSpeed * ((double)ObjMetaData.Bandwidth / 100.0));
            throttleStreamObj = new ThrottledStream(ObjMetaData.FileStream, bps);

            var metaInfo = new System.Collections.Specialized.NameValueCollection();
            if (ObjMetaData.MetaInfo != null)
            {
                foreach (DictionaryEntry kvp in ObjMetaData.MetaInfo)
                {
                    metaInfo.Add(kvp.Key.ToString(), kvp.Value.ToString());
                }
            }

            // Part size is this chunk's offset range, capped at the file length.
            long OffDiff = ObjMetaData.EndOffset - ObjMetaData.StartOffset;
            long partSize = (fin.Length >= OffDiff) ? OffDiff : fin.Length;

            if (uploadType == UploadType.File)
            {
                fileTransferUtilityRequest = new TransferUtilityUploadRequest()
                    .WithBucketName(existingBucketName)
                    .WithStorageClass(S3StorageClass.ReducedRedundancy)
                    .WithMetadata(metaInfo)
                    .WithPartSize(partSize)
                    .WithKey(keyName)
                    .WithCannedACL(S3CannedACL.PublicRead)
                    .WithTimeout(Int32.MaxValue - 1)
                    .WithInputStream(throttleStreamObj) as TransferUtilityUploadRequest;
            }
            else if (uploadType == UploadType.Stream)
            {
                fileTransferUtilityRequest = new TransferUtilityUploadRequest()
                    .WithBucketName(existingBucketName)
                    .WithStorageClass(S3StorageClass.ReducedRedundancy)
                    .WithMetadata(metaInfo)
                    .WithPartSize(partSize)
                    .WithKey(keyName)
                    .WithCannedACL(S3CannedACL.PublicRead)
                    .WithTimeout(Int32.MaxValue - 1)
                    .WithInputStream(throttleStreamObj) as TransferUtilityUploadRequest;
            }

            for (int index = 1; index <= RetryTimes; index++)
            {
                try
                {
                    // Upload the part; on success raise the chunk-uploaded event and stop retrying.
                    fileTransferUtility.Upload(fileTransferUtilityRequest);
                    Console.WriteLine(" ====== Upload Done =========");
                    if (eventChunkUploaded != null)
                        eventChunkUploaded(ObjMetaData);
                    break;
                }
                catch (Exception ex)
                {
                    // Only log and raise the error event after the last attempt.
                    if (index == RetryTimes)
                    {
                        m_objLogFile.LogError(CLASS_NAME, METHOD_NAME + " - Attempt " +
                            index + Environment.NewLine + FilePath, ex);

                        if (eventChunkUploadError != null)
                            eventChunkUploadError(ObjMetaData, ex.Message);
                    }
                }
            }
        }
        catch (Exception ex)
        {
            m_objLogFile.LogError(CLASS_NAME, METHOD_NAME, ex);
            return false;
        }
        finally
        {
            if (throttleStreamObj != null)
            {
                throttleStreamObj = null;
            }
        }

        return true;
    }

Let me know if you run into any issues.
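For context, here is a hedged sketch of how a method like the one above might be driven from multiple threads, one chunk per task. `ChunkDetails`, its `StartOffset` and `FileStream` members, and the `uploader` instance are the answerer's custom types, so every name here is an assumption; the point is that each task opens its own stream positioned at its chunk's offset.

```csharp
// Hypothetical driver: one Task per chunk, each with its own stream.
var tasks = chunks.Select(chunk => Task.Factory.StartNew(() =>
{
    var stream = File.OpenRead(filePath);             // per-task stream, never shared
    stream.Seek(chunk.StartOffset, SeekOrigin.Begin); // position at this chunk's range
    chunk.FileStream = stream;
    uploader.UploadUsingHighLevelAPI(filePath, chunk,
        S3Operations.UploadType.Stream, stream);
})).ToArray();
Task.WaitAll(tasks);
```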

Score: 0

Stack Overflow user
Posted on 2013-09-30 23:47:06

I believe there is a race condition in the code. I am working on the same functionality and would be happy to share code with you. If you fix the code you posted, I would appreciate a link to it. Best regards, Bruce

Score: 0
Original content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/16084954
