I have a Lambda function that is invoked when an object is put into S3. It connects to another EC2 instance and runs a bash script there. I've confirmed that the bash script, and the Python code outside of the Lambda wrapper, work fine. However, wrapping it in a Lambda produces the same error every time, and I can't figure out why. The role attached to the Lambda appears to have all the required EC2 and S3 policies attached. No VPC is attached to the Lambda function.
Code:
import boto3
import botocore
import paramiko

def lambda_handler(event, context):
    # Download the SSH private key from S3 into Lambda's writable /tmp directory
    s3_client = boto3.client('s3')
    s3_client.download_file('mycluster', 'keys/ec2box.pem', '/tmp/ec2box.pem')

    # Load the key and open an SSH connection to the EC2 instance
    k = paramiko.RSAKey.from_private_key_file('/tmp/ec2box.pem')
    c = paramiko.SSHClient()
    c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    c.connect(hostname='99.99.9999', username='centos', pkey=k)

    commands = [
        "cd /home/dir1/;chmod +x file.sh;nohup ./file.sh > logs/program"
    ]
    for command in commands:
        print("Executing {}".format(command))
        stdin, stdout, stderr = c.exec_command(command)
        print(stdout.read())
        print(stderr.read())

    return {
        'message': "Script execution completed. See Cloudwatch logs for complete output"
    }
My error:
[Errno 110] Connection timed out: error
Traceback (most recent call last):
File "/var/task/pythonprogram.py", line 17, in lambda_handler
c.connect(hostname='99.99.9999', username='centos', pkey=k)
File "/var/task/paramiko/client.py", line 338, in connect
retry_on_signal(lambda: sock.connect(addr))
File "/var/task/paramiko/util.py", line 279, in retry_on_signal
return function()
File "/var/task/paramiko/client.py", line 338, in <lambda>
retry_on_signal(lambda: sock.connect(addr))
File "/usr/lib64/python2.7/socket.py", line 228, in meth
return getattr(self._sock,name)(*args)
error: [Errno 110] Connection timed out
Any ideas?
Figured it out. Honestly, in hindsight this is obvious, but hopefully it helps someone who googles this. Check the security group of the EC2 instance you are SSHing into and make sure port 22 is actually open. Mine was locked down fairly tightly, and all I had to do was add a new inbound TCP rule to allow the required traffic.
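If you want to script that fix rather than use the console, here's a minimal sketch using boto3's authorize_security_group_ingress. The security group ID and CIDR below are placeholders you would replace with your own values. Also note that because the Lambda has no VPC attached, its outbound traffic originates from AWS-managed public IPs, so you generally can't pin the rule to a single source address unless you put the Lambda in a VPC behind a NAT.

import boto3

# Hypothetical values -- substitute the security group attached to your EC2
# instance and a CIDR range appropriate for where the Lambda traffic originates.
ec2 = boto3.client('ec2')
ec2.authorize_security_group_ingress(
    GroupId='sg-0123456789abcdef0',
    IpPermissions=[{
        'IpProtocol': 'tcp',
        'FromPort': 22,
        'ToPort': 22,
        'IpRanges': [{
            'CidrIp': '0.0.0.0/0',
            'Description': 'SSH for Lambda (tighten this range if you can)'
        }],
    }],
)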