How do I handle the AWS rate limit (50 requests per second) when creating secrets in a Python Lambda function?

Question

I am running a backup-and-restore test for AWS Secrets Manager in which I need to create roughly 1000 secrets from a backup JSON file using a Python Lambda function. AWS imposes a rate limit of 50 requests per second. My current approach creates the secrets sequentially, but the Lambda times out before all of them are created; the remaining secrets get processed on subsequent runs.

Here is an excerpt of my current function that handles secret creation:

import base64
import boto3
import time
from botocore.exceptions import ClientError

MAX_REQUESTS_PER_SECOND = 50

def get_secrets_manager_client(region_name):
    return boto3.client('secretsmanager', region_name=region_name)

def create_secrets_batch(secrets):
    success_count = 0
    failed_secrets = []
    # Create the client once instead of once per secret.
    secrets_manager_client = get_secrets_manager_client('eu-west-1')

    for i in range(0, len(secrets), MAX_REQUESTS_PER_SECOND):
        batch = secrets[i:i + MAX_REQUESTS_PER_SECOND]
        batch_start = time.monotonic()

        for secret in batch:
            name = secret['Name']
            secret_string = base64.b64decode(secret['SecretString']).decode("utf-8")
            print(f"Processing secret: {name}")

            try:
                secrets_manager_client.create_secret(
                    Name=name,
                    SecretString=secret_string
                )
                success_count += 1
                print(f"Secret {name} created successfully.")
            except secrets_manager_client.exceptions.ResourceExistsException:
                print(f"Secret {name} already exists.")
            except ClientError as e:
                print(f"Failed to create secret {name}: {str(e)}")
                failed_secrets.append(name)

        # Respect the rate limit of 50 requests per second: if the batch
        # finished in under a second, sleep only for the remainder.
        elapsed = time.monotonic() - batch_start
        if len(batch) == MAX_REQUESTS_PER_SECOND and elapsed < 1:
            print("Sleeping to avoid rate limit...")
            time.sleep(1 - elapsed)

    return success_count, failed_secrets

The full code can be found here.

What I have tried:

  • Researching concurrency options such as asyncio and threading
  • Considering splitting the secrets into batches

What I need help with: How can I keep requests under 50 per second while still processing all 1000 secrets efficiently within the rate limit?

python amazon-web-services aws-lambda concurrency boto3
1 Answer
Score: -4

When working with AWS Secrets Manager and you need to create a large number of secrets while respecting the 50-requests-per-second API rate limit, you can achieve this with Python's concurrent.futures module and careful timing.

Pseudocode:

FUNCTION create_secret(secret_manager, secret_name, secret_value):
    TRY:
        Create secret using secret_manager.create_secret()
        RETURN success message
    CATCH ClientError as e:
        IF error is 'ResourceExistsException':
            RETURN "Secret already exists" message
        ELSE:
            RETURN error message

FUNCTION lambda_handler(event, context):
    Initialize secret_manager client
    Load secrets_data from JSON file
    
    SET total_secrets = number of secrets in secrets_data
    SET secrets_created = 0
    SET rate_limit = 50  // AWS API rate limit
    SET batch_size = 50  // Number of secrets to process in each batch
    
    FUNCTION process_batch(batch):
        Initialize empty results list
        CREATE ThreadPoolExecutor with max_workers = rate_limit
            FOR EACH secret_name, secret_value IN batch:
                Submit create_secret task to executor
            FOR EACH completed future:
                Append result to results list
        RETURN results
    
    SET start_time = current time
    
    FOR i = 0 TO total_secrets STEP batch_size:
        Create batch of secrets from secrets_data
        CALL process_batch(batch)
        Update secrets_created count
        
        Print progress
        
        Calculate elapsed_time
        IF elapsed_time < 1 second:
            Sleep for remaining time to complete 1 second
        
        Reset start_time
    
    RETURN success response with secrets created count
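The pseudocode above can be turned into a concrete function. The sketch below is an illustration, not the asker's exact code: the helper names (`create_secret`, `create_secrets_batch`) and the `'Value'` key are my own, and it assumes the secrets are already decoded and that a boto3 Secrets Manager client is passed in. In production you would catch `botocore.exceptions.ClientError` rather than bare `Exception`.

```python
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

RATE_LIMIT = 50  # assumed CreateSecret quota: 50 requests per second

def create_secret(client, name, value):
    # Create one secret; treat "already exists" as a non-fatal outcome.
    try:
        client.create_secret(Name=name, SecretString=value)
        return name, "created"
    except client.exceptions.ResourceExistsException:
        return name, "exists"
    except Exception as exc:  # use botocore's ClientError in real code
        return name, f"failed: {exc}"

def create_secrets_batch(client, secrets):
    # Fire one batch of up to RATE_LIMIT requests concurrently, then wait
    # out the remainder of the second before starting the next batch.
    results = []
    for i in range(0, len(secrets), RATE_LIMIT):
        batch = secrets[i:i + RATE_LIMIT]
        start = time.monotonic()
        with ThreadPoolExecutor(max_workers=RATE_LIMIT) as pool:
            futures = [pool.submit(create_secret, client,
                                   s['Name'], s['Value'])
                       for s in batch]
            results.extend(f.result() for f in as_completed(futures))
        elapsed = time.monotonic() - start
        if i + RATE_LIMIT < len(secrets) and elapsed < 1.0:
            time.sleep(1.0 - elapsed)
    return results
```

In the Lambda handler you would call it as `create_secrets_batch(boto3.client('secretsmanager', region_name='eu-west-1'), secrets)`; with 1000 secrets this finishes in about 20 batches, so the function's timeout still needs to allow roughly 20 seconds plus request latency.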