I'm new to working with S3 and OpenSearch but got thrust into trying to solve a logs issue. I know how to pull down a folder from my bucket, but that results in way more logs than I need. I only need the logs that have "account_id_pin" in _source.log.data. Is there a way to write a bash script that parses the logs and pulls down only those? Once I have them locally, I just need to append to that field's value with a script and sync them back. I couldn't find anything for this in AWS. I know there are the Dev Tools in OpenSearch, but it won't let me run an _update_by_query at this level (unless I'm completely mistaken). I'm just trying to avoid pulling down a bunch of logs that won't be touched.
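To show the shape of what I'm trying to do on the S3 side: stream each object through a text check, keep only the ones mentioning the field, edit the survivors, and sync back. This is just a sketch of my intent, not something I've verified — the bucket/prefix names, the `filtered/` directory, and the `_masked` suffix are all placeholders:

```shell
#!/usr/bin/env bash

# Succeeds only if the log object on stdin mentions account_id_pin.
# A cheap textual check, not a real JSON-path test.
matches_pin() {
  grep -q '"account_id_pin"'
}

# Append SUFFIX to _source.log.data.account_id_pin inside FILE.
# Shells out to python3 so the JSON is edited safely in place.
update_pin() {  # usage: update_pin FILE SUFFIX
  python3 - "$1" "$2" <<'PY'
import json, sys

path, suffix = sys.argv[1], sys.argv[2]
with open(path) as f:
    doc = json.load(f)
data = doc.get("_source", {}).get("log", {}).get("data", {})
if isinstance(data, dict) and "account_id_pin" in data:
    data["account_id_pin"] = str(data["account_id_pin"]) + suffix
with open(path, "w") as f:
    json.dump(doc, f)
PY
}

# Intended loop (placeholder bucket/prefix -- not run here, and it does
# download each matching object twice, once to test and once to keep):
#   mkdir -p filtered
#   aws s3 ls "s3://my-log-bucket/logs/" --recursive | awk '{print $4}' |
#   while read -r key; do
#     aws s3 cp "s3://my-log-bucket/${key}" - | matches_pin &&
#       aws s3 cp "s3://my-log-bucket/${key}" "filtered/$(basename "$key")"
#   done
#   for f in filtered/*; do update_pin "$f" "_masked"; done
#   aws s3 sync filtered/ "s3://my-log-bucket/logs/"
```

The double download could be avoided by copying every object once and deleting non-matches, but the version above keeps nothing extra on disk.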
This is the script I tried in the OpenSearch Dev Tools to update them, but no luck:
"script": {
  "lang": "painless",
  "source": """
    if (ctx._source.containsKey('log') && ctx._source.log.containsKey('data') && ctx._source.log.data.containsKey('account_id_pin')) {
      ctx._source.log.data.account_id_pin = "xxx__newValue";
    }
    if (ctx._source.containsKey('log_processed') && ctx._source.log_processed.containsKey('msg') && ctx._source.log_processed.msg.containsKey('data') && ctx._source.log_processed.msg.data.containsKey('account_id_pin')) {
      ctx._source.log_processed.msg.data.account_id_pin = "xxx__newValue_____";
    }
  """
}
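For completeness, this is the full request shape I was attempting, in case my permissions do allow it (the index name is a placeholder, and the `exists` clauses are meant to restrict the update to documents that actually carry the field):

```json
POST my-logs-index/_update_by_query
{
  "query": {
    "bool": {
      "should": [
        { "exists": { "field": "log.data.account_id_pin" } },
        { "exists": { "field": "log_processed.msg.data.account_id_pin" } }
      ],
      "minimum_should_match": 1
    }
  },
  "script": {
    "lang": "painless",
    "source": "... the script above ..."
  }
}
```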