I use GitHub Actions to deploy to AWS Lambda. In my code, I want to upload a file to the S3 bucket `hello.test.output.data`. However, I mistakenly wrote `hello.test.output.data` as `hello.dev.output.data`. After running the Lambda, I discovered that the file was uploaded to `hello.dev.output.data`, even though the IAM user I use in GitHub Actions does not have permission to access `hello.dev.output.data`. Therefore, my question is: Is there anything wrong with my settings?
Below are the details of my setup.
Thank you.
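For context, the upload in my handler looks roughly like this (a simplified sketch; the key and body are placeholders, and the bucket name is read from the `S3_BUCKET_OUTPUT` environment variable set in the SAM template below):

```typescript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

// Simplified sketch: the real handler builds the body from the job it processes.
export async function uploadOutput(body: string): Promise<void> {
  await s3.send(
    new PutObjectCommand({
      Bucket: process.env.S3_BUCKET_OUTPUT, // set in template.test.yaml
      Key: "output/result.json",            // placeholder key
      Body: body,
    })
  );
}
```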
- I have an IAM user named `CustomUser-Hello-TEST-GitHubAction`.
- This IAM user has a policy named `CustomPolicy-Hello-TEST-S3`:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::hello.test*"
      ],
      "Effect": "Allow",
      "Sid": "VisualEditor1"
    },
    {
      "Action": [
        "s3:ListStorageLensConfigurations",
        "s3:ListAccessPointsForObjectLambda",
        "s3:ListBucketMultipartUploads",
        "s3:ListAllMyBuckets",
        "s3:ListAccessPoints",
        "s3:ListJobs",
        "s3:ListBucketVersions",
        "s3:ListBucket",
        "s3:ListMultiRegionAccessPoints",
        "s3:ListMultipartUploadParts"
      ],
      "Resource": "*",
      "Effect": "Allow",
      "Sid": "VisualEditor0"
    }
  ]
}
```
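As I read it, the first statement only matches resources under `arn:aws:s3:::hello.test*`, so I would not expect this user to be able to write to `hello.dev.output.data`. This is roughly how I would double-check that (a sketch only; the account ID is a placeholder and it assumes this is the only policy attached to the user):

```typescript
import { IAMClient, SimulatePrincipalPolicyCommand } from "@aws-sdk/client-iam";

const iam = new IAMClient({ region: "us-east-1" });

// Simulates s3:PutObject for the GitHub Actions user against both buckets.
async function checkPolicy(): Promise<void> {
  const result = await iam.send(
    new SimulatePrincipalPolicyCommand({
      PolicySourceArn: "arn:aws:iam::xxxxxxxx:user/CustomUser-Hello-TEST-GitHubAction", // placeholder account ID
      ActionNames: ["s3:PutObject"],
      ResourceArns: [
        "arn:aws:s3:::hello.test.output.data/some-key",
        "arn:aws:s3:::hello.dev.output.data/some-key",
      ],
    })
  );

  for (const r of result.EvaluationResults ?? []) {
    console.log(r.EvalResourceName, r.EvalDecision); // "allowed" vs. "implicitDeny"
  }
}

checkPolicy();
```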
- I added `.github/workflows/test-pipeline.yml`:
```yaml
name: Deploy to TEST
on:
  push:
    branches:
      - test
concurrency:
  group: ${{ github.workflow }}
  cancel-in-progress: true
jobs:
  deploy-to-test:
    runs-on: ubuntu-latest
    environment:
      name: test
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
      - uses: aws-actions/setup-sam@v2
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID_TEST }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY_TEST }}
          aws-region: us-east-1
      - run: |
          npm ci
      # compile TS to JS
      - name: Compile TS
        run: npm run compile
      # copy package.json to the built folder so packages can be installed there
      - run: cp package.json built/
      - run: cp .npmrc built/
      # sam build
      - run: sam build --use-container --template template.test.yaml
      # sam deploy
      - run: sam deploy --no-confirm-changeset --no-fail-on-empty-changeset --stack-name Hello-TEST --s3-bucket hello.tmp --capabilities CAPABILITY_IAM --region us-east-1
```
- The `template.test.yaml` used in `test-pipeline.yml`:
```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
# More info about Globals: https://github.com/awslabs/serverless-application-model/blob/master/docs/globals.rst
Globals:
  Function:
    MemorySize: 1024
    Timeout: 900
    Tags:
      project: hello
      stage: test
Resources:
  Hello:
    Type: AWS::Serverless::Function # More info about Function Resource: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction
    Properties:
      CodeUri: ./built
      Handler: app.handler
      Runtime: nodejs18.x
      Role: arn:aws:iam::xxxxxxxx:role/CustomRole-TEST-Lambda
      FunctionName: Hello-TEST
      Environment: # More info about Env Vars: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#environment-object
        Variables:
          S3_STORAGE_CLASS: STANDARD
          S3_BUCKET_OUTPUT: hello.dev.output.data
          MYSQL_HOST: xxxxxxxx
          MYSQL_DB: xxxxxxxx
          MYSQL_PORT: 3306
      VpcConfig:
        SecurityGroupIds:
          - sg-xxxxxxxx
        SubnetIds:
          - subnet-xxxxxxxx
          - subnet-xxxxxxxx
      Events:
        MySqsEvent:
          Type: SQS # More info about API Event Source: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#api
          Properties:
            Queue: arn:aws:sqs:us-east-1:xxxxxxxx:Hello-TEST-Jobs
            BatchSize: 1
      EventInvokeConfig:
        MaximumRetryAttempts: 0 # Integer (Min: 0, Max: 2)
```
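One thing I am not sure about is which identity actually performs the upload at runtime, since the template assigns `CustomRole-TEST-Lambda` to the function rather than the GitHub Actions user. This is a small diagnostic I could add to the handler to check (sketch only; it just logs the caller identity via STS):

```typescript
import { STSClient, GetCallerIdentityCommand } from "@aws-sdk/client-sts";

const sts = new STSClient({ region: "us-east-1" });

// Diagnostic only: logs the ARN of the identity the Lambda actually runs under.
export async function logCallerIdentity(): Promise<void> {
  const identity = await sts.send(new GetCallerIdentityCommand({}));
  console.log("Running as:", identity.Arn); // expected to be the assumed role, not the IAM user
}
```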
Is there anything wrong with my settings?