Riak-CS Node.js client

Patrick F. Marques patrickfmarques at gmail.com
Mon Mar 9 06:05:56 EDT 2015


Thanks! I'll stay tuned ;)
Until then I will patch my code and pray not to introduce any issues here.
The patch is below for anyone to try (it just sends the decoded string to be
signed).

From a439f6126317a2b66fc08baf31b24b47e8ec4ed9 Mon Sep 17 00:00:00 2001
From: "Patrick F. Marques" <patrickmarques at baboom.com>
Date: Thu, 26 Feb 2015 14:59:14 +0000
Subject: [PATCH] [fix]

---
 lib/signers/s3.js |    2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/lib/signers/s3.js b/lib/signers/s3.js
index 2f49fff..604e8a0 100644
--- a/lib/signers/s3.js
+++ b/lib/signers/s3.js
@@ -73,7 +73,7 @@ AWS.Signers.S3 = inherit(AWS.Signers.RequestSigner, {

     var headers = this.canonicalizedAmzHeaders();
     if (headers) parts.push(headers);
-    parts.push(this.canonicalizedResource());
+    parts.push(decodeURIComponent(this.canonicalizedResource()));

     return parts.join('\n');

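To illustrate what the one-line change does: the canonicalized resource for a
multipart request carries the uploadId as a percent-encoded query value, and
the patch feeds the decoded form into the signature instead. A minimal sketch
(the bucket, key, and uploadId values below are made up for illustration):

```javascript
// The canonicalized resource as the SDK builds it, with the uploadId
// still percent-encoded ('=' appears as '%3D'). Values are hypothetical.
var encoded = '/test/myKey?uploadId=VXBsb2FkSWQ%3D';

// What the patched signer signs instead: the decoded resource.
var decoded = decodeURIComponent(encoded);

console.log(decoded); // '/test/myKey?uploadId=VXBsb2FkSWQ='
```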
On Thu, Feb 26, 2015 at 2:13 PM, Kota Uenishi <kota at basho.com> wrote:

> > The s3 signer signs the "canonicalizedResource", which has the query
> > parameters already encoded, so I tried replacing the "%3D" with "=" and
> > it works.
>
> Yay! The culprit is here. Most clients mistakenly encode the multipart
> uploadId although it is already supposed to be url-encoded. This is
> the case for #1063, too. Maybe Riak CS can be aligned with how S3
> behaves to save most S3 clients - stay tuned to that issue, please.
> Anyway, thank you for reporting!
>
>
>
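The double-encoding Kota describes can be seen directly in Node.js: the
uploadId comes back from the server already url-encoded, so encoding it again
mangles the '%' character itself (the uploadId value below is hypothetical):

```javascript
// An uploadId as returned by the server - already url-encoded,
// so '=' is represented as '%3D'. The value itself is made up.
var uploadId = 'VXBsb2FkSWQ%3D';

// A client that encodes it a second time double-encodes the '%',
// so the resource it signs no longer matches what the server signs.
var doubleEncoded = encodeURIComponent(uploadId);

console.log(doubleEncoded); // 'VXBsb2FkSWQ%253D'
```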
> On Thu, Feb 26, 2015 at 9:54 PM, Patrick F. Marques
> <patrickfmarques at gmail.com> wrote:
> > Hi,
> >
> > thanks for your help Uenishi.
> >
> > I'm using Riak 1.5.2 and AWS Node.js SDK 2.1.14, and the example code I'm
> > running is below.
> > I have been trying with and without forcing a signing version. With some
> > debugging I found that the default is to use the s3 signer. If I force v2
> > I get another error, "Cannot set property 'Timestamp' of undefined", which
> > is thrown by the v2.js signer code; I made a simple fix but then every
> > request returns "Access Denied".
> >
> > The s3 signer signs the "canonicalizedResource", which has the query
> > parameters already encoded, so I tried to replace the "%3D" with "="
> > and it works.
> >
> >
> > // ----------------------------------
> >
> > 'use strict';
> >
> > var fs = require('fs');
> > var path = require('path');
> > var zlib = require('zlib');
> >
> > var config = {
> >     accessKeyId: 'WDH-HCBBZONGEY2PADRC',
> >     secretAccessKey: '9nJpf_C3hoaGrMBbvWH_pJ7qQT5ijrQKrN2XVg==',
> >     // region: 'eu'
> >
> >     httpOptions: {
> >         proxy: 'http://192.168.56.100:8080'
> >     },
> >
> >     signatureVersion: 'v2'
> > };
> >
> > var bigfile = path.join('./', 'bigfile');
> > var body = fs.createReadStream(bigfile).pipe(zlib.createGzip());
> >
> > var AWS = require('aws-sdk');
> > var s3 = new AWS.S3(new AWS.Config(config));
> >
> > var params = {
> >     Bucket: 'test',
> >     Key: 'myKey',
> >     Body: body
> > };
> >
> > s3.upload(params).
> >     on('httpUploadProgress', function(evt) { console.log(evt); }).
> >     send(function(err, data) {
> >         console.log(err, data);
> >     });
> >
> > // ----------------------------------
> >
> > Best Regards,
> > Patrick Marques
> >
> >
> > On Thu, Feb 26, 2015 at 6:47 AM, Kota Uenishi <kota at basho.com> wrote:
> >>
> >> Hi,
> >>
> >> My 6th sense says you're hitting this problem:
> >> https://github.com/basho/riak_cs/issues/1063
> >>
> >> Could you give me an example of code or a debug print from the Node.js
> >> client that includes the source string before it is signed by the secret
> >> key?
> >>
> >> Otherwise maybe the client is just using v4 authentication, which we
> >> haven't supported yet. To avoid it, please try v2 authentication.
> >>
> >> 2015/02/26 9:06 "Patrick F. Marques" <patrickfmarques at gmail.com>:
> >>>
> >>> Hi everyone,
> >>>
> >>> I'm trying to use the AWS SDK as an S3 client for Riak CS to upload
> >>> large objects whose size I usually don't know in advance. For that
> >>> purpose I'm trying to use multipart upload as in the SDK example:
> >>> https://github.com/aws/aws-sdk-js/blob/master/doc-src/guide/node-examples.md#amazon-s3-uploading-an-arbitrarily-sized-stream-upload
> >>> The problem is that I'm always getting Access Denied.
> >>>
> >>> I've been trying some other clients but also without success.
> >>>
> >>> Best regards,
> >>> Patrick Marques
> >>>
> >>>
> >>>
> >>> _______________________________________________
> >>> riak-users mailing list
> >>> riak-users at lists.basho.com
> >>> http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
> >>>
> >
>
>
>
> --
> Kota UENISHI / @kuenishi
> Basho Japan KK
>