I'm hitting a segmentation fault while running an AWS Lambda function. The function fetches data from multiple sources, processes it, and uploads the result to an S3 bucket. The fetching and processing work fine, but the function crashes during the file-upload step with the following error:


The Lambda logs show that credentials are loaded correctly from environment variables and that the upload starts, but it crashes shortly afterwards. I've tried increasing the memory allocation and the execution timeout, without success.

```python
import os
import tempfile
import time

import boto3
from botocore.exceptions import ClientError


def upload_to_s3(dataframe, bucket_name, file_name):
    """Upload DataFrame as CSV to S3."""
    logger.info(f"Uploading file to S3 bucket {bucket_name} with key {file_name}...")
    retries = 3
    for attempt in range(retries):
        try:
            with tempfile.NamedTemporaryFile(delete=False) as temp_file:
                dataframe.to_csv(temp_file.name, index=False)
            s3 = boto3.client('s3')
            with open(temp_file.name, 'rb') as f:
                s3.put_object(Bucket=bucket_name, Key=file_name, Body=f)
            os.remove(temp_file.name)
            logger.info(f"File uploaded to S3: {bucket_name}/{file_name}")
            break
        except ClientError as e:
            if attempt == retries - 1:
                logger.error(f"S3 upload failed after {retries} attempts: {e}")
                raise
            logger.warning(f"S3 upload attempt {attempt + 1} failed, retrying...")
            time.sleep(2)
```
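One way to narrow this down, if the crash turns out to be storage-related, is to log how much of the function's ephemeral storage is free just before writing the temporary file. A minimal sketch (`log_tmp_usage` is a hypothetical helper name):

```python
import logging
import shutil

logger = logging.getLogger(__name__)


def log_tmp_usage(path="/tmp"):
    """Log total/used/free space for Lambda's ephemeral storage at `path`."""
    usage = shutil.disk_usage(path)
    logger.info(
        "tmp storage: total=%d MB used=%d MB free=%d MB",
        usage.total // 2**20,
        usage.used // 2**20,
        usage.free // 2**20,
    )
    return usage
```

Calling this at the top of `upload_to_s3` would show in CloudWatch whether `/tmp` is close to full when the crash happens.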
Any insight into diagnosing or resolving this segmentation fault during the S3 upload? Could it be related to boto3, Lambda environment limits, or a dependency issue?

Any help would be appreciated.

Have you checked how large your CSV file is?

Since the entire CSV file is written to `/tmp` and then read back before the upload, you may be exceeding the ephemeral storage. By default, Lambda gives you 512 MB of ephemeral storage, though it can now be configured up to 10 GB.

Lambda ephemeral storage update
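If that is the cause and the CSV fits in the function's memory, one workaround is to skip `/tmp` entirely and serialize the DataFrame to an in-memory buffer. A minimal sketch, assuming pandas is available as in the original function (`dataframe_to_csv_bytes` and `upload_dataframe_in_memory` are hypothetical names):

```python
import io


def dataframe_to_csv_bytes(dataframe):
    """Serialize a DataFrame to CSV bytes entirely in memory."""
    buffer = io.StringIO()
    dataframe.to_csv(buffer, index=False)
    return buffer.getvalue().encode("utf-8")


def upload_dataframe_in_memory(dataframe, bucket_name, file_name):
    """Upload a DataFrame as CSV to S3 without touching /tmp."""
    import boto3  # imported here so the serialization helper has no AWS dependency

    body = dataframe_to_csv_bytes(dataframe)
    s3 = boto3.client("s3")
    s3.put_object(Bucket=bucket_name, Key=file_name, Body=body)
```

Note that `put_object` holds the whole body in memory, so this trades ephemeral storage for RAM; for files larger than the memory allocation, a multipart upload via boto3's `upload_fileobj` would be the next step.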


python amazon-web-services amazon-s3 aws-lambda segmentation-fault