DynamoDB import CSV from s3 not importing all items
I am using the import from S3 option to import a CSV into a DynamoDB table. This has worked fine before, but for this particular CSV of 500 items, AWS is only importing 100 of them. It seems like DynamoDB is importing the first 50 and last 50 items in the CSV and ignoring the rest.
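A quick way to see what the import job itself thinks happened is to pull its description, which reports processed/imported/error counts and points at a CloudWatch log group with per-item errors. A minimal diagnostic sketch, assuming boto3's describe_import call and a placeholder import ARN you would replace with your own (visible in the console or via list_imports()):

```python
# Hypothetical diagnostic sketch: ask DynamoDB what the import job reported.
# Replace IMPORT_ARN with the ARN of your import (console or list_imports()).
import boto3

dynamodb = boto3.client("dynamodb")
IMPORT_ARN = "arn:aws:dynamodb:REGION:ACCOUNT:table/my-table/import/..."  # placeholder

desc = dynamodb.describe_import(ImportArn=IMPORT_ARN)["ImportTableDescription"]
print("status:         ", desc.get("ImportStatus"))
print("processed items:", desc.get("ProcessedItemCount"))
print("imported items: ", desc.get("ImportedItemCount"))
print("error count:    ", desc.get("ErrorCount"))
print("error log group:", desc.get("CloudWatchLogGroupArn"))
```

If the processed and imported counts differ while the error count stays at zero, duplicate primary keys in the CSV are worth ruling out, since (as far as I can tell) the import keeps only one item per key rather than flagging duplicates as errors; otherwise the CloudWatch log group listed above should name the rows that were skipped.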
Simple Cost Effective Approach to Maintaining a Simple Database (on AWS)
I would like to keep track of about 10 million username/country pairs daily and track changes. So I will run my code once a day and would like to efficiently keep a history of the country info for each username. If I were running this on my local computer, I could make a simple SQLite database and track accordingly. But this project requires a somewhat more robust approach, where I will continuously run the code on an EC2 instance and update the database. I have been reading about DynamoDB, S3, etc. and cannot figure out what an expert programmer's approach to this situation would be.
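To make the DynamoDB option a bit more concrete: a table keyed by username with the observation date as a sort key naturally gives you the per-username history, and writing a row only when the country actually changes keeps daily writes proportional to churn rather than to all 10 million pairs. A rough sketch, assuming boto3 and a hypothetical table username_country_history with partition key username and sort key observed_on (an ISO date string):

```python
# Rough sketch (names are placeholders): append a history row only when a
# username's country changes, in a table keyed by (username, observed_on).
import datetime
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("username_country_history")

def record_changes(pairs):
    """pairs: iterable of (username, country) tuples from today's run."""
    today = datetime.date.today().isoformat()
    with table.batch_writer() as batch:
        for username, country in pairs:
            # Fetch the newest observation for this username, if any.
            resp = table.query(
                KeyConditionExpression=Key("username").eq(username),
                ScanIndexForward=False,  # newest first
                Limit=1,
            )
            latest = resp.get("Items", [])
            # Write only when the country is new or has changed.
            if not latest or latest[0]["country"] != country:
                batch.put_item(Item={
                    "username": username,
                    "observed_on": today,
                    "country": country,
                })
```

Whether DynamoDB or plain files on S3 is the cheaper home depends mostly on how the history is read back; the sketch above only illustrates the store-changes-per-username idea.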
How To Control Object Type When Importing Large File From S3 To DynamoDB
I have a CSV sitting in an S3 bucket, about 900,000 rows long, and within that CSV I have two columns, phone and ttl.
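Assuming the goal is for ttl to land as a DynamoDB Number rather than a String (the CSV flavor of import-from-S3 does not, as far as I know, let you pick attribute types), one workaround is to convert the file to DynamoDB JSON lines and import that instead. A minimal sketch with placeholder file names and an assumed phone,ttl header row:

```python
# Hypothetical sketch: convert a phone,ttl CSV into DynamoDB JSON lines so the
# import stores ttl as a Number ("N") instead of a String ("S").
# File names are placeholders; gzip the output before uploading if you prefer.
import csv
import json

with open("input.csv", newline="") as src, open("output.json", "w") as dst:
    for row in csv.DictReader(src):  # expects a header row: phone,ttl
        item = {
            "Item": {
                "phone": {"S": row["phone"]},
                # DynamoDB JSON encodes numbers as strings inside an "N" wrapper.
                "ttl": {"N": str(int(row["ttl"]))},
            }
        }
        dst.write(json.dumps(item) + "\n")
```

The converted file can then be imported with the input format set to DYNAMODB_JSON instead of CSV, so phone comes through as an S attribute and ttl as an N attribute.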