How to scp to an EC2 instance via SSM agent using boto3 and send a file


If you have SSH over SSM set up, you can just use normal scp, like so:

scp file.txt ec2-user@i-04843lr540028e96a:

If it isn't working, make sure you have:

  • Session Manager plugin installed locally
  • Your key pair on the instance and locally (you will need to define it in your ssh config, or via the -i switch)
  • SSM agent on the instance (installed by default on Amazon Linux 2)
  • An instance role attached to the instance that allows Session Manager (it needs to be there at boot, so if you just attached it, reboot the instance)

Reference: https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager-getting-started-enable-ssh-connections.html
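For reference, a minimal per-user ~/.ssh/config sketch; the User and IdentityFile values here are assumptions, so adjust them to your AMI's login user and your own key pair:

```
# SSH over Session Manager
Host i-* mi-*
    User ec2-user
    IdentityFile ~/.ssh/my-key.pem   # hypothetical key path; point at your own key pair
    ProxyCommand sh -c "aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters 'portNumber=%p'"
```

With this in place, scp file.txt i-04843lr540028e96a:/tmp/ tunnels the copy through Session Manager.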

If you need more detail, give me more info on your setup, and I'll try and help.



Author: Naggappan Ramukannan

I love programming and Linux. http://www.naggappan.wordpress.com https://www.linkedin.com/in/naggappan-ramukannan-77a68523

Updated on June 04, 2022

Comments

  • Naggappan Ramukannan almost 2 years

    Hi, I need to transfer a file to an EC2 machine via the SSM agent. I have successfully installed the ssm-agent on the EC2 instances, and from the UI I am able to start a session via Session Manager and log in to the shell of that EC2 machine.

    Now I tried to automate it via boto3, using the code below:

    import boto3

    ssm_client = boto3.client('ssm', 'us-west-2')
    resp = ssm_client.send_command(
        DocumentName="AWS-RunShellScript",  # one of AWS' preconfigured documents
        Parameters={'commands': ['echo "hello world" >> /tmp/test.txt']},
        InstanceIds=['i-xxxxx'],
    )
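Note that send_command is asynchronous: it returns as soon as the command is queued, not when it finishes. A hedged sketch of waiting for the result, using boto3's get_command_invocation (the ssm_client argument is whatever boto3.client('ssm', ...) returned):

```python
import time

def wait_for_command(ssm_client, command_id, instance_id, timeout=60):
    """Poll an SSM command until it leaves Pending/InProgress/Delayed.

    ssm_client is expected to behave like a boto3 SSM client, e.g.
    boto3.client('ssm', 'us-west-2').
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        inv = ssm_client.get_command_invocation(
            CommandId=command_id, InstanceId=instance_id)
        if inv["Status"] not in ("Pending", "InProgress", "Delayed"):
            return inv  # Success, Failed, Cancelled, TimedOut, ...
        time.sleep(2)
    raise TimeoutError(f"command {command_id} did not finish in {timeout}s")

# usage against a real client (sketch):
# resp = ssm_client.send_command(...)
# inv = wait_for_command(ssm_client, resp["Command"]["CommandId"], "i-xxxxx")
# print(inv["Status"], inv.get("StandardOutputContent", ""))
```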
    

    The above works fine and I am able to create a file called test.txt on the remote machine, but this is via the echo command. Instead, I need to send a file from my local machine to this remote EC2 machine via the SSM agent, hence I did the following.

    Modified /etc/ssh/ssh_config with a proxy entry, as below:

    # SSH over Session Manager
    host i-* mi-*
        ProxyCommand sh -c "aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters 'portNumber=%p'"
    

    Then, in the above code, I tried to start a session with the line below, and that also succeeds:

    response = ssm_client.start_session(Target='i-04843lr540028e96a')
    

    Now I am not sure how to use this session response, or how to use this AWS SSM session to send a file.

    Environment description:

    • Source: a pod running in an EKS cluster
    • Destination: an EC2 machine (which has the SSM agent running)
    • File to be transferred: an important private key, which will be used by a process on the EC2 machine and will be different for each machine

    Solutions tried:

    • I can push the file to S3 from the source, and then use the ssm boto3 library to pull it from S3 and store it on the remote EC2 machine
    • But I don't want to do the above, because I don't want to store the private key in S3. So I want to send the file directly from memory to the remote EC2 machine

    Basically I want to achieve the scp approach mentioned in this AWS document: https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager-working-with-sessions-start.html#sessions-start-ssh

    • John Rotenstein over 3 years
      It might be easier to "pull" the file into the instance. For example, if the file is stored in Amazon S3, then put an aws s3 cp command in the shell script.
  • Naggappan Ramukannan over 3 years
    The client is a container running on EKS nodes and the server is an EC2 machine (which has the SSM agent installed). The issue is that I can't go with keys, as I'd then need to add a public key to every EC2 machine and also rebuild the container with my private key. So I am looking for a solution with boto3, without the keys option. One way is via S3: I can push a file, but here, due to a security issue, I can't use S3 to publish data and then use it in my script. So does this mean I have only 2 options to send a file via SSM? 1) via scp (with keys), the steps mentioned above 2) via S3
  • Nathan Williams over 3 years
    Ah, I thought you meant your laptop to EC2. EKS to EC2 would need a different solution. What are your other requirements? Is latency an issue, what about retries, what happens on the EC2 when the file is copied? Without more detail, it is hard to suggest something. I'm guessing S3 & using SNS + SQS to notify the server of a new file would probably be a better solution. Please update your original question with more detail on your full requirements, and I'll try again :)
  • Naggappan Ramukannan over 3 years
    I updated my question. As it's a private key, I can't save it to S3; this key is generated by my service, which runs as a pod in EKS, and is sent to the EC2 machine (note: each machine will get a different key)
  • Naggappan Ramukannan over 3 years
    And if I use SNS/SQS, I'd need to add new consumers for this purpose on the EC2 machines, which is also not possible, as these EC2 machines are kind of appliances
  • Nathan Williams over 3 years
    Hmm, sorry, I don't have a good answer. All I can think of is that you could use EC2 Instance Connect, which lets you send a key temporarily to the EC2 instance: docs.aws.amazon.com/AWSEC2/latest/UserGuide/…
  • Matteo over 2 years
    But why do I need an SSH key pair? The whole point of SSM is to get rid of SSH keys.
  • Nathan Williams over 2 years
    I was suggesting using SSH over SSM (treating SSM as a form of proxy for SSH). As such, you still need to authenticate the SSH connection.
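For the record, the in-memory transfer the question asks about can be approximated without S3 or SSH keys, by embedding the file's bytes, base64-encoded, in an AWS-RunShellScript command. This is a sketch, not from the thread, and only practical for small files such as keys, since send_command parameters are size-limited; build_push_command is a hypothetical helper:

```python
import base64
import shlex

def build_push_command(data: bytes, dest_path: str) -> str:
    """Return a shell command that recreates `data` at `dest_path`.

    The bytes are base64-encoded so they pass safely through the
    JSON/shell layers of AWS-RunShellScript.
    """
    b64 = base64.b64encode(data).decode("ascii")
    return f"echo {b64} | base64 -d > {shlex.quote(dest_path)}"

# hypothetical usage with the question's client:
# ssm_client.send_command(
#     DocumentName="AWS-RunShellScript",
#     Parameters={"commands": [build_push_command(key_bytes, "/etc/app/key.pem")]},
#     InstanceIds=["i-xxxxx"],
# )
```

Note the key still transits the SSM data channel and may appear in command history, so a follow-up chmod and careful SSM logging configuration would be sensible.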