Hi everyone, I am trying to use a custom endpoint URL (S3 Ninja) for S3 emulation when running a Lambda function locally. I can use the custom endpoint with either the AWS CLI or a plain Python script, but not from within the Lambda function, and I'm unsure as to why.

S3 Ninja

S3 Ninja runs in a Docker container. I'm not mapping a volume, as I'm happy for all buckets to be disposed of every time the container restarts.

docker run -d -p 9444:9000 scireum/s3-ninja:latest

The UI can be reached at http://localhost:9444/ui, and the example access and secret keys can be found on the home page.

Python Script

Now I can interact with S3 Ninja using this simple script, and output the results of a list_objects_v2 call:

import boto3
import json
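# The access and secret keys below are the example credentials shown on the
# S3 Ninja home page, and endpoint_url points at the container published on
# localhost:9444. The same endpoint can also be exercised from the AWS CLI,
# for example (an illustrative check, with the example keys exported as
# AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY):
#   aws s3 ls --endpoint-url http://localhost:9444/s3/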
client = boto3.client(
    service_name='s3',
    aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
    aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
    endpoint_url='http://localhost:9444/s3/'
)
bucket = 'test-bucket'
body = '{"key": "value"}'.encode('utf-8')
key = 'test.json'
client.create_bucket(Bucket=bucket)
client.put_object(Bucket=bucket, Key=key, Body=body)
list_objects = client.list_objects_v2(Bucket=bucket)
print(json.dumps(list_objects, indent=4, default=lambda x: x.isoformat()))

Python Script - Output

{
"ResponseMetadata": {
"HTTPStatusCode": 200,
"HTTPHeaders": {
"transfer-encoding": "chunked",
"content-type": "application/xml",
"cache-control": "no-cache, max-age=0",
"last-modified": "Fri, 13 May 2022 10:22:50 GMT",
"connection": "keep-alive",
"server": "8f1a1f30b511 (scireum SIRIUS - powered by Netty)",
"p3p": "CP=\"This site does not have a p3p policy.\"",
"vary": "origin"
},
"RetryAttempts": 0
},
"IsTruncated": false,
"Contents": [
{
"Key": "test.json",
"LastModified": "2022-05-13T10:22:50+00:00",
"ETag": "88bac95f31528d13a072c05f2a1cf371",
"Size": 16,
"StorageClass": "STANDARD"
}
],
"Name": "test-bucket",
"Prefix": "",
"MaxKeys": 1000
}

Lambda Function

However, if I use the same S3 client code in a Lambda function, execution fails.

import boto3
def lambda_handler(event, context):
    client = boto3.client(
        service_name='s3',
        aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
        aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
        endpoint_url='http://localhost:9444/s3/'
    )
    bucket = 'test-bucket'
    body = '{"key": "value"}'.encode('utf-8')
    key = 'test.json'
    client.create_bucket(Bucket=bucket)
    client.put_object(Bucket=bucket, Key=key, Body=body)
    list_objects = client.list_objects_v2(Bucket=bucket)
    return {
        'statusCode': int(list_objects['ResponseMetadata']['HTTPStatusCode']),
        'body': list_objects,
        'headers': {
            'AWS-Request-ID': context.aws_request_id,
            'Content-Type': 'application/json'
        }
    }

Lambda Function - Response

Execution of this Lambda function now results in an error.
Invalid lambda response received: Invalid API Gateway Response Keys: {'errorMessage', 'stackTrace', 'errorType'} in {'errorMessage': 'Could not connect to the endpoint URL: "http://localhost:9444/s3/test-bucket"', 'errorType': 'EndpointConnectionError', 'stackTrace': [moved below to make it more readable]}

Lambda Function - Stack Trace
Replies: 1 comment
OK, so this question is now closed...

I found out after posting this that each invocation of my Lambda function is run in a Docker container when using sam local start-api. It then became obvious that the SAM container couldn't talk to S3 Ninja on localhost, and that the issue was not related to boto3 in any way.

Happily, I found a Stack Overflow answer that suggested using the default network gateway IP address in place of localhost, so now everything works.

import boto3
def lambda_handler(event, context):
    client = boto3.client(
        service_name='s3',
        aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
        aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
        # 172.17.0.1 is the default Docker bridge gateway, i.e. the host as
        # seen from inside the SAM invocation container.
        endpoint_url='http://172.17.0.1:9444/s3/'
    )
    bucket = 'test-bucket'
    body = '{"key": "value"}'.encode('utf-8')
    key = 'test.json'
    client.create_bucket(Bucket=bucket)
    client.put_object(Bucket=bucket, Key=key, Body=body)
    list_objects = client.list_objects_v2(Bucket=bucket)
    return {
        'statusCode': int(list_objects['ResponseMetadata']['HTTPStatusCode']),
        'body': list_objects,
        'headers': {
            'AWS-Request-ID': context.aws_request_id,
            'Content-Type': 'application/json'
        }
    }
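As a possible refinement (a minimal sketch, not what I actually deployed): the endpoint could be read from an environment variable instead of being hard-coded. S3_ENDPOINT_URL below is an illustrative name of my own, which could be supplied through the template's Environment section or sam local start-api --env-vars env.json. sam local also accepts a --docker-network flag, so attaching the invocation containers to the same Docker network as S3 Ninja and using the container name as the hostname should work as well.

import json
import os
import boto3

def lambda_handler(event, context):
    # S3_ENDPOINT_URL is an illustrative variable name; fall back to the
    # Docker bridge gateway address used above if it is not set.
    endpoint_url = os.environ.get('S3_ENDPOINT_URL', 'http://172.17.0.1:9444/s3/')
    client = boto3.client(
        service_name='s3',
        aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
        aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
        endpoint_url=endpoint_url
    )
    bucket = 'test-bucket'
    client.create_bucket(Bucket=bucket)
    client.put_object(Bucket=bucket, Key='test.json', Body=b'{"key": "value"}')
    # Return only JSON-serialisable data so the local API Gateway can render the body.
    keys = [obj['Key'] for obj in client.list_objects_v2(Bucket=bucket).get('Contents', [])]
    return {
        'statusCode': 200,
        'body': json.dumps(keys),
        'headers': {'Content-Type': 'application/json'}
    }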