I’m working on a Backup & Restore test for AWS Secrets Manager, where I need to create about 1000 secrets from a backup JSON file using a Python Lambda function. AWS imposes a rate limit of 50 requests per second on secret creation. My current approach creates the secrets sequentially, but the Lambda times out before all of them are created; on a subsequent run, the remaining secrets are processed.
Here is an excerpt of my current function handling the creation of secrets:
import base64
import time

import boto3
from botocore.exceptions import ClientError

MAX_REQUESTS_PER_SECOND = 50


def get_secrets_manager_client(region_name):
    return boto3.client('secretsmanager', region_name=region_name)


def create_secrets_batch(secrets):
    success_count = 0
    failed_secrets = []
    # Create the client once and reuse it for every request
    secrets_manager_client = get_secrets_manager_client('eu-west-1')
    for i in range(0, len(secrets), MAX_REQUESTS_PER_SECOND):
        batch = secrets[i:i + MAX_REQUESTS_PER_SECOND]
        for secret in batch:
            name = secret['Name']
            secret_string = base64.b64decode(secret['SecretString']).decode("utf-8")
            print(f"Processing secret: {name}")
            try:
                secrets_manager_client.create_secret(
                    Name=name,
                    SecretString=secret_string
                )
                success_count += 1
                print(f"Secret {name} created successfully.")
            except secrets_manager_client.exceptions.ResourceExistsException:
                print(f"Secret {name} already exists.")
            except ClientError as e:
                print(f"Failed to create secret {name}: {str(e)}")
                failed_secrets.append(name)
        # Respect the rate limit of 50 requests per second
        if len(batch) == MAX_REQUESTS_PER_SECOND:
            print("Sleeping for 1 second to avoid rate limit...")
            time.sleep(1)
    return success_count, failed_secrets
The complete code can be found here
What I’ve tried:
- Investigated concurrency options like asyncio and threading
- Considered splitting the secrets into batches
What I need help with:
How can I keep the requests below 50 per second while still processing all 1000 secrets efficiently, before the Lambda times out?
To create a large number of secrets in AWS Secrets Manager while staying under the API rate limit of 50 requests per second, you can use Python’s concurrent.futures module together with careful timing.
Here is that approach as runnable Python. The backup file path and region are placeholders, and the code assumes the backup JSON is a list of {"Name": ..., "SecretString": ...} entries with base64-encoded values, matching your format; adapt as needed:

import base64
import json
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

import boto3
from botocore.exceptions import ClientError

RATE_LIMIT = 50   # AWS Secrets Manager allows 50 requests per second
BATCH_SIZE = 50   # Secrets submitted per one-second window


def create_secret(secrets_manager, secret):
    name = secret['Name']
    value = base64.b64decode(secret['SecretString']).decode('utf-8')
    try:
        secrets_manager.create_secret(Name=name, SecretString=value)
        return f"Secret {name} created successfully."
    except ClientError as e:
        if e.response['Error']['Code'] == 'ResourceExistsException':
            return f"Secret {name} already exists."
        return f"Failed to create secret {name}: {e}"


def lambda_handler(event, context):
    secrets_manager = boto3.client('secretsmanager', region_name='eu-west-1')

    # Placeholder path: load the backup JSON from wherever your test stores it
    with open('/tmp/secrets_backup.json') as f:
        secrets_data = json.load(f)

    total_secrets = len(secrets_data)
    secrets_created = 0

    def process_batch(batch):
        results = []
        # boto3 clients are thread-safe, so a single client can be shared
        # across all worker threads
        with ThreadPoolExecutor(max_workers=RATE_LIMIT) as executor:
            futures = [
                executor.submit(create_secret, secrets_manager, secret)
                for secret in batch
            ]
            for future in as_completed(futures):
                results.append(future.result())
        return results

    for i in range(0, total_secrets, BATCH_SIZE):
        start_time = time.monotonic()
        batch = secrets_data[i:i + BATCH_SIZE]
        for result in process_batch(batch):
            print(result)
        secrets_created += len(batch)
        print(f"Progress: {secrets_created}/{total_secrets}")

        # If the batch took less than a second, sleep out the remainder so
        # no one-second window ever sees more than 50 requests
        elapsed = time.monotonic() - start_time
        if elapsed < 1:
            time.sleep(1 - elapsed)

    return {
        'statusCode': 200,
        'body': f"Processed {secrets_created} of {total_secrets} secrets."
    }
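Since the Lambda was timing out mid-run, you may also want to stop before the hard deadline so the next invocation can pick up the remaining secrets, as you observed it does. A minimal sketch using the Lambda context object; the 5-second margin is an arbitrary safety buffer, not an AWS-recommended value:

# At the top of the batch loop in lambda_handler:
for i in range(0, total_secrets, BATCH_SIZE):
    # get_remaining_time_in_millis() is provided by the Lambda context
    if context.get_remaining_time_in_millis() < 5000:
        print(f"Stopping early at {secrets_created}/{total_secrets}; "
              "re-run to create the rest.")
        break
    # ... process the batch as above

With the batches running concurrently, 1000 secrets at 50 per second take roughly 20 seconds in total, so a modest Lambda timeout should be enough for the whole restore.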