I’m trying to get the total number of provisioned read capacity units of a DynamoDB table, using CloudWatch metrics, so I can calculate the cost of one specific table for a month. The table has since been switched from Provisioned to On-Demand mode, and all of its metric statistics have been reset, so I don’t know how many RCUs it used while it was in Provisioned mode. I’m also not allowed to manage the tags used for cost allocation, so Cost Explorer is pretty much a no-go.
The target utilization of both RCU and WCU for all tables in the region is 50%.
My CLI command looks something like this:
aws cloudwatch get-metric-statistics \
  --namespace "AWS/DynamoDB" \
  --metric-name "ProvisionedReadCapacityUnits" \
  --dimensions Name=TableName,Value=my_table \
  --start-time 2024-04-01T00:00:00Z \
  --end-time 2024-04-30T23:59:59Z \
  --period 3600 --statistics Sum \
  --query "Datapoints[:].Sum"
# Result:
# [
#   120.0,
#   120.0,
#   125.0,
#   100.0,
#   ...
# ]
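For reference, this is roughly how I add the datapoints up afterwards (a minimal sketch; the sample values are just the ones from the output above, not real data):

```python
import json

# cli_output stands in for the JSON array printed by the
# get-metric-statistics command above (hypothetical sample data).
cli_output = "[120.0, 120.0, 125.0, 100.0]"

# Parse the array of hourly Sum values and total them.
datapoints = json.loads(cli_output)
total = sum(datapoints)

print(total)  # -> 465.0
```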
However, after running the command and adding up all the datapoints, the total I got for this one table was almost 10 times higher than the number of provisioned RCUs reported for all tables in the region for that month in the billing report.
How can I get the precise total number of capacity units used in a month, one that matches the billing report?
If that’s not possible, how can I get the cost of one specific table for that period?