I am trying to make a dashboard using data from CloudWatch Logs. On AWS Batch, I have a job queue (I'll call it A). On queue A, some jobs succeeded and some failed. When I click on a failed job, I can click on its Log stream name to view the logs in CloudWatch under log events. My goal is to use the AWS SDK to automatically upload the logs for the failed jobs to either AWS RDS or S3 so that I can access the data. Is there a way to do this with the AWS SDK in Python? Also, is there a way to make that Python script run whenever a job fails on AWS Batch queue A, or maybe whenever the user opens the dashboard in QuickSight? Thank you.
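Here is roughly what I have pieced together so far for the export part. This is an untested sketch; the bucket name is a placeholder I made up, and I am assuming the default Batch log group `/aws/batch/job`:

```python
# Rough sketch (untested): pull logs for FAILED jobs on queue "A" and copy them to S3.
# Assumptions: default Batch log group /aws/batch/job, and a placeholder bucket name.
import json
import boto3

batch = boto3.client("batch")
logs = boto3.client("logs")
s3 = boto3.client("s3")

BUCKET = "my-batch-failure-logs"   # placeholder bucket name
LOG_GROUP = "/aws/batch/job"       # default log group for AWS Batch container jobs


def export_failed_job_logs(job_queue="A"):
    # List failed jobs on the queue (list_jobs is paginated)
    paginator = batch.get_paginator("list_jobs")
    for page in paginator.paginate(jobQueue=job_queue, jobStatus="FAILED"):
        for summary in page["jobSummaryList"]:
            job_id = summary["jobId"]
            # describe_jobs exposes the CloudWatch log stream name for the container
            detail = batch.describe_jobs(jobs=[job_id])["jobs"][0]
            stream = detail["container"].get("logStreamName")
            if not stream:
                continue
            # Read all log events from the stream, following nextForwardToken
            events = []
            kwargs = {"logGroupName": LOG_GROUP, "logStreamName": stream, "startFromHead": True}
            while True:
                resp = logs.get_log_events(**kwargs)
                events.extend(resp["events"])
                if kwargs.get("nextToken") == resp["nextForwardToken"]:
                    break
                kwargs["nextToken"] = resp["nextForwardToken"]
            # One object per failed job, so QuickSight (or Athena) can pick it up later
            s3.put_object(
                Bucket=BUCKET,
                Key=f"batch-failures/{job_id}.json",
                Body=json.dumps(events).encode("utf-8"),
            )


if __name__ == "__main__":
    export_failed_job_logs()
```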
I was able to download a CSV file from CloudWatch manually, so if I can upload the logs for all the failed jobs, it should be possible to display them in QuickSight. I have gone through the AWS SDK documentation for a while but couldn't figure out how to do this.
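For the trigger part, my guess from the docs is that Batch publishes "Batch Job State Change" events to EventBridge, so a rule could invoke a Lambda running the script above whenever a job on queue A fails, but I'm not sure this is the right approach. A sketch of what I mean, where the Lambda and queue ARNs are placeholders:

```python
# Rough sketch (untested): an EventBridge rule that fires when a job on queue "A" fails,
# targeting a Lambda that would run the log-export code above.
# The Lambda ARN and queue ARN below are made-up placeholders.
import json
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

RULE_NAME = "batch-queue-A-failed-jobs"
LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:export-failed-batch-logs"
QUEUE_ARN = "arn:aws:batch:us-east-1:123456789012:job-queue/A"

# AWS Batch emits "Batch Job State Change" events; match only FAILED jobs on queue A.
pattern = {
    "source": ["aws.batch"],
    "detail-type": ["Batch Job State Change"],
    "detail": {"status": ["FAILED"], "jobQueue": [QUEUE_ARN]},
}

rule_arn = events.put_rule(Name=RULE_NAME, EventPattern=json.dumps(pattern))["RuleArn"]

# Allow EventBridge to invoke the Lambda, then attach it as the rule's target.
lambda_client.add_permission(
    FunctionName=LAMBDA_ARN,
    StatementId="allow-eventbridge-batch-failures",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule_arn,
)
events.put_targets(Rule=RULE_NAME, Targets=[{"Id": "export-logs-lambda", "Arn": LAMBDA_ARN}])
```

Does this look like the right direction, or is there a simpler way to get the failed-job logs somewhere QuickSight can read them?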