Dynamo Local from Node-aws: all operations fail "Cannot do operations on a non-existent table"


Solution 1

The problem is that the JavaScript console and your app use different profiles (credential and region), so DynamoDB Local creates a separate database file for each of them. If you start the local DynamoDB with the -sharedDb flag, a single database file is shared by all clients.

From the doc:

-sharedDb — DynamoDB Local will use a single database file, instead of using separate files for each credential and region. If you specify -sharedDb, all DynamoDB Local clients will interact with the same set of tables regardless of their region and credential configuration.
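
If you run the downloadable JAR directly rather than through Docker, the same flag is simply appended to the usual start command; the library path below assumes the default layout of the extracted DynamoDB Local archive:

java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb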

Solution 2

Those who are using the official DynamoDB Local Docker image can enable sharedDb by starting the container like this:

docker run -p 8000:8000 amazon/dynamodb-local -jar DynamoDBLocal.jar -inMemory -sharedDb

The original ENTRYPOINT and CMD used by the image can be seen in the output of docker inspect amazon/dynamodb-local and are:

"Entrypoint": [
    "java"
]
"Cmd": [
    "-jar",
    "DynamoDBLocal.jar",
    "-inMemory"
]

So we basically need to copy them and add -sharedDb.
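
Once the container is running with -sharedDb, a quick sanity check from Node is to list the tables against the local endpoint. The sketch below reuses the endpoint and table name from the question; the region and key values are arbitrary placeholders, since with -sharedDb they no longer select a separate database file:

var AWS = require('aws-sdk');

// Point the client at the local container started above.
var dynamodb = new AWS.DynamoDB({
    apiVersion: '2012-08-10',
    region: 'us-west-2',              // arbitrary: with -sharedDb every region sees the same tables
    endpoint: 'http://localhost:8000',
    accessKeyId: 'localAccessKey',    // placeholders; DynamoDB Local does not validate them
    secretAccessKey: 'localSecretKey'
});

// Should list the tables created from the JavaScript console, e.g. [ 'environmentId' ]
dynamodb.listTables({}, function(err, data) {
    console.log(err || data.TableNames);
});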



Comments

  • Rog
    Rog almost 2 years

    I have a local dynamo-db running. I have set up my tables using the JavaScript console and they list OK from there.

    I can also put and get items to my tables from the JavaScript console:

    var params = { TableName:"environmentId", Item: { environmentId: {"S":"a4fe1736-98cf-4560-bcf4-cc927730dd1b"} }};
    dynamodb.putItem(params, function(err, data) {
        console.log("put : err was " + JSON.stringify(err) + " and data is " + JSON.stringify(data));
    });
    

    prints put : err was null and data is {} which I'm assuming is "success" because

    params = { "Key":{"environmentId":{"S":"a4fe1736-98cf-45e0-bcf4-cc927730dd1b"}},"TableName":"environmentId"}
    dynamodb.getItem(params, function(err, data) {
        console.log("get : err was " + JSON.stringify(err) + " and data is " + JSON.stringify(data));
    });
    

    prints get : err was null and data is {"Item":{"environmentId":{"S":"a4fe1736-98cf-45e0-bcf4-cc927730dd1b"}}} i.e. it retrieves the object I just put to the table.

    However, if I fire up the node REPL and type:

    var AWS = require('aws-sdk');
    AWS.config.loadFromPath("./config/credentials.js");
    endpoint = new AWS.Endpoint("http://localhost:8000");
    var dynamoOpts = {apiVersion: '2012-08-10', 'endpoint':endpoint};
    var dynamodb = new AWS.DynamoDB(dynamoOpts);
    var params = { TableName:"environmentId", Item: { environmentId: {"S":"a4fe1736-98cf-4560-bcf4-cc927730dd1b"} }};
    dynamodb.putItem(params, function(err, data) {
        console.log("put : err was " + JSON.stringify(err) + " and data is " + JSON.stringify(data));
    });
    

    I get a resource not found error:

    { "message":"Cannot do operations on a non-existent table",
        "code":"ResourceNotFoundException",
        "time":"2015-04 10T10:01:26.319Z",
        "statusCode":400,
        "retryable":false,
        "retryDelay":0
    }
    

    The AWS.Request object returned from the putItem call has the correct endpoint:

    { protocol: 'http:',
        host: 'localhost:8000',
        port: 8000,
        hostname: 'localhost',
        pathname: '/',
        // etc.
    

    The same thing happens from my Node app; however, the same code works when connecting to the real AWS-hosted DynamoDB.

    • Brett Bim
      Brett Bim over 4 years
      This could be tagged with other languages too; same behavior for .NET.
  • Rog
    Rog about 9 years
    Thanks. The fact that the JavaScript console would use a different db was not at all clear to me from that doc line.
  • Richard Lee
    Richard Lee almost 7 years
    You can use a Docker container with a volume set up: docker run -d -p 8000:8000 -v /tmp/data:/data/ dwmkerr/dynamodb -dbPath /data/ (see "Run Amazon DynamoDB locally with Docker"). Note: you also need to set up an Access Key Id in the DynamoDB JS shell; see "DynamoDB Local loses tables?"
  • wassgren
    wassgren over 5 years
    There is an official docker image available from Amazon now on Docker hub. hub.docker.com/r/amazon/dynamodb-local
  • Jason
    Jason almost 5 years
    dear god you are a life saver!
  • dz902
    dz902 over 4 years
    well what can i say, i owe you like 40 hours of life
  • S A
    S A almost 4 years
    I am facing this error 'Unrecognized option: -inMemory '
  • gigi2
    gigi2 about 3 years
    Thank you so much! I wish I saw this earlier
  • vastlysuperiorman
    vastlysuperiorman over 2 years
    I personally had control over everything, so it was better for me to keep the in-memory tables and just make sure that regions and credentials were identical in all test fixtures. Though, I did have to run tcpdump on port 8000 before I found the last set of bad credentials.
  • Shannon
    Shannon over 2 years
    Alternatively, make sure all your clients use the same access key ID and the right region, as in the sketch below.
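
    Without -sharedDb, that means pinning the same dummy credentials and a single region on every client that talks to the local instance. A minimal Node-side sketch, where the key values and region are arbitrary placeholders that just have to match whatever the JavaScript console is configured with:

    var AWS = require('aws-sdk');

    // Same (dummy) credentials and region everywhere, so DynamoDB Local picks the same database file.
    AWS.config.update({
        region: 'us-west-2',              // must match the region the tables were created under
        accessKeyId: 'localAccessKey',    // placeholder; Local only uses it to select the database file
        secretAccessKey: 'localSecretKey' // placeholder
    });

    var dynamodb = new AWS.DynamoDB({ endpoint: 'http://localhost:8000' });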