Stop Google Document AI processor billing
I am using Google Document AI with custom processors deployed in Google Cloud. I want to stop using them for a year and halt the billing without deleting them. Is there a way to pause billing and reactivate the processors later?
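For context, the closest thing I have found to "pausing" is to undeploy the trained processor version and disable the processor, then re-enable and redeploy it later. A minimal sketch, assuming the v1 Python client (google-cloud-documentai); the project and processor IDs are placeholders, and I am assuming hosting charges accrue only while a custom version is deployed:

```python
# Minimal sketch, assuming the v1 Python client (google-cloud-documentai).
# PROJECT_ID and PROCESSOR_ID are placeholders.
from google.cloud import documentai

client = documentai.DocumentProcessorServiceClient()
processor_name = "projects/PROJECT_ID/locations/us/processors/PROCESSOR_ID"

processor = client.get_processor(name=processor_name)
if processor.default_processor_version:
    # Undeploying the deployed version should stop the hourly hosting
    # charge that custom processors accrue while a version is deployed.
    client.undeploy_processor_version(
        request=documentai.UndeployProcessorVersionRequest(
            name=processor.default_processor_version
        )
    ).result()

# Disabling the processor rejects new requests; it can be re-enabled
# (enable_processor) and the version redeployed (deploy_processor_version)
# when needed again.
client.disable_processor(
    request=documentai.DisableProcessorRequest(name=processor_name)
).result()
```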
Custom classifier/splitter test dataset size limit
I am currently working on a project that uses the Document AI custom classifier, and I have a question regarding the test dataset size limit.
As I understand it, the current limit on the test dataset size is 2,000 documents. However, I would like to use more test samples than this; I have a total of 20k+ documents, which is within the training dataset limits. Could you please advise on the best way to achieve this?
Is there a way to bypass or increase the 2,000-document limit for the test dataset? If so, what steps do I need to follow? Additionally, are there any considerations or potential implications I should be aware of when working with a larger test dataset?
I would greatly appreciate your guidance on this; expertise with the GCP Document AI service would be invaluable in addressing this requirement for my project.
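In case it helps frame the question: if the 2,000-document cap is a hard limit, the fallback I am considering is to evaluate the remaining documents outside the Workbench dataset, by calling the trained processor directly and scoring the predictions myself. A rough sketch, again assuming the v1 Python client; the processor name, file paths, and labels are placeholders:

```python
# Rough sketch: score extra test documents outside the managed dataset.
# The processor name, file paths, and ground-truth labels are placeholders.
from google.cloud import documentai

client = documentai.DocumentProcessorServiceClient()
processor_name = "projects/PROJECT_ID/locations/us/processors/PROCESSOR_ID"

def classify(path: str) -> str:
    with open(path, "rb") as f:
        raw = documentai.RawDocument(content=f.read(), mime_type="application/pdf")
    result = client.process_document(
        request=documentai.ProcessRequest(name=processor_name, raw_document=raw)
    )
    # A classifier returns one entity per candidate class; take the
    # highest-confidence prediction as the assigned label.
    return max(result.document.entities, key=lambda e: e.confidence).type_

ground_truth = {"docs/a.pdf": "invoice", "docs/b.pdf": "receipt"}  # placeholders
correct = sum(classify(path) == label for path, label in ground_truth.items())
print(f"accuracy: {correct / len(ground_truth):.2%}")
```

For 20k+ documents, batch_process_documents over a Cloud Storage bucket would be kinder to quotas than per-document online calls, but the idea is the same.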
Converting a Document object to a DataFrame/CSV with the Document AI Toolbox
This code sample from the Google Cloud docs is supposed to produce output as CSV, HTML, or Markdown files, but when run in a Google Colab notebook, all that appears in the output is 'Tables in Document'.
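The relevant part of the sample is roughly the following (reproduced from memory rather than verbatim; the input path is a placeholder, and the Document JSON needs to come from a processor that actually detects tables, such as the Form Parser):

```python
# Sketch of the docs sample (not verbatim). "document.json" is a
# placeholder for a local Document AI output file.
from google.cloud.documentai_toolbox import document

wrapped_document = document.Document.from_document_path(
    document_path="document.json"
)

print("Tables in Document")
for page_index, page in enumerate(wrapped_document.pages):
    for table_index, table in enumerate(page.tables):
        # Each detected table converts to a pandas DataFrame, which can
        # then be written out as CSV (or HTML/Markdown via to_html /
        # to_markdown).
        df = table.to_dataframe()
        print(df)
        df.to_csv(f"table-{page_index}-{table_index}.csv", index=False)
```

Since the 'Tables in Document' header prints unconditionally, seeing only that line suggests that page.tables is empty on every page, for example because the JSON was produced by an OCR-only processor rather than one that extracts tables.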