"Excess found in a non pipelined read" 使用 Curl 发布 200MB 文件时

"Excess found in a non pipelined read" when posting 200MB file with Curl

I'm trying to send the following request to post a 200MB attachment to an Autodesk API bucket endpoint:

Douglass-MBP-2:Desktop douglasduhaime$ curl -v 'https://developer.api.autodesk.com/oss/v2/buckets/secret-bucket/objects/200mbfile.nwd' -X 'PUT' -H 'Authorization: Bearer myOauthCredentials' -H 'Content-Type: application/octet-stream' -H 'Content-Length: 308331' -T '200mbfile.nwd'

This request produces the following response:

*   Trying 52.7.124.118...
* Connected to developer.api.autodesk.com (52.7.124.118) port 443 (#0)
* TLS 1.2 connection using TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
* Server certificate: developer.api.autodesk.com
* Server certificate: Symantec Class 3 Extended Validation SHA256 SSL CA
* Server certificate: VeriSign Universal Root Certification Authority
> PUT /oss/v2/buckets/secret-bucket/objects/200mbfile.nwd HTTP/1.1
> Host: developer.api.autodesk.com
> User-Agent: curl/7.43.0
> Accept: */*
> Authorization: Bearer MyOauthCredentials
> Content-Type: application/octet-stream
> Content-Length: 308331
> Expect: 100-continue
>
< HTTP/1.1 100 Continue
< HTTP/1.1 200 OK
< Access-Control-Allow-Credentials: true
< Access-Control-Allow-Headers: Authorization, Accept-Encoding, Range, Content-Type
< Access-Control-Allow-Methods: GET
< Access-Control-Allow-Origin: *
< Content-Type: application/json; charset=utf-8
< Date: Fri, 09 Sep 2016 15:36:51 GMT
< Server: Apigee Router
< Content-Length: 467
< Connection: keep-alive
<
* Excess found in a non pipelined read: excess = 66, size = 467, maxdownload = 467, bytecount = 0

Does anyone know how I can get the full payload to send? If anyone can offer any suggestions, I'd be most grateful!

It didn't take me long to figure this out: I just had to remove the -H 'Content-Length: 308331' header from the PUT request (I had copied that Content-Length from their tutorial, and it was smaller than the payload I was actually sending, hence the message above).
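For reference, the fixed request is just the original one with that header removed; when no explicit Content-Length is given, curl computes it from the file passed to -T (same placeholder bucket, object key, and token as above):

curl -v 'https://developer.api.autodesk.com/oss/v2/buckets/secret-bucket/objects/200mbfile.nwd' -X 'PUT' -H 'Authorization: Bearer myOauthCredentials' -H 'Content-Type: application/octet-stream' -T '200mbfile.nwd'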

For large uploads, it is better to use the /resumable endpoint. An example of its use is available from that sample and is also pasted below. Note that this is an excerpt: fs, async, and the helpers it calls (readBucketKey, oauthExec, ossObjects, and so on) are defined elsewhere in that sample.

program
.command ('resumable')
.description ('upload a file in multiple pieces (i.e. resumables)')
.arguments ('<file> <pieces>')
.action (function (file, pieces) {
    pieces =pieces || 2 ;
    var bucketKey =readBucketKey () ;
    if ( !checkBucketKey (bucketKey) )
        return ;
    var fileKey =makeKey (file) ;
    fs.stat (file, function (err, stats) {
        if ( err )
            return (console.log (err.message)) ;
        var size =stats.size ;
        var pieceSz =parseInt (size / pieces) ;
        var modSz =size % pieces ;
        if ( modSz )
            pieces++ ;
        console.log ('Uploading file: ' + file + ' in ' + pieces + ' pieces') ;
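        // Build the index list [0, 1, ..., pieces - 1], one entry per chunk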
        var piecesMap =Array.apply (null, { length: pieces }).map (Number.call, Number) ;
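        // Random 12-character session id shared by every chunk of this upload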
        var sessionId =Math.random ().toString (36).replace (/[^a-z]+/g, '').substr (0, 12) ;
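        // Upload the chunks one at a time (eachLimit with a concurrency of 1)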
        async.eachLimit (piecesMap, 1,
            function (i, callback) {
                var start =i * pieceSz ;
                var end =Math.min (size, (i + 1) * pieceSz) - 1 ;
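                // Content-Range value for this chunk, e.g. "bytes 0-1048575/209715200"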
                var range ="bytes " + start + "-" + end + "/" + size ;
                var length =end - start + 1 ;
                console.log ('Loading ' + range) ;
                // For resumable (large files), make sure to renew the token first
                //access_token (function () {
                oauthExec ()
                    .then (function (accessToken) {
                        var readStream =fs.createReadStream (file, { 'start': start, 'end': end }) ;
                        return (ossObjects.uploadChunk (bucketKey, fileKey, length, range, sessionId, readStream, {})) ;
                    })
                    .then (function (data) {
                        callback () ;
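                        // uploadChunk appears to resolve with no data until the final chunk completes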
                        if ( data === undefined )
                            return (console.log ('Partial upload accepted')) ;
                        fs.writeFile (__dirname + '/data/' + bucketKey + '.' + fileKey + '.json', JSON.stringify (data, null, 4), function (err) {
                            if ( err )
                                return (console.error ('Failed to create ' + bucketKey + '.' + fileKey + '.json file')) ;
                        }) ;
                        console.log ('Upload successful') ;
                        console.log ('ID: ' + data.objectId) ;
                        console.log ('URN: ' + new Buffer (data.objectId).toString ('base64')) ;
                        console.log ('Location: ' + data.location) ;
                    })
                    .catch (function (error) {
                        errorHandler (error, 'Failed to upload file') ;
                    })
                ;
            }) ;
    }) ;
}) ;
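With the command registered as above, the upload can then be started from the sample's CLI along these lines (the script name here is hypothetical; the last argument is the number of pieces):

node forge-cli.js resumable 200mbfile.nwd 5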