I have a daily file that I want to process using the OpenAI API. It is a text file of approximately 4 MB and 250,000 lines containing names. Using tiktoken, I calculated that each file is approximately 1.5 million tokens (for the gpt-3.5-turbo model).
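For reference, the token count comes from something like this (a minimal sketch; names.txt is a placeholder for the actual daily file):

```python
import tiktoken

# Use the tokenizer that matches the target model.
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

# names.txt is a placeholder for the daily file described above.
with open("names.txt", "r", encoding="utf-8") as f:
    text = f.read()

print(f"{len(enc.encode(text)):,} tokens")
```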
Using ChatGPT, I can upload the file, ask the intended question, and get the result almost instantly.
However, to do this programmatically via the API, the only way I know is batch processing, which is tremendously slow and often hits rate limits (I'm currently on Tier 1).
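Roughly, my current approach looks like this (a minimal sketch; chunk_lines, names.txt, the chunk size, and the prompt text are all placeholders for my actual setup):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def chunk_lines(path, lines_per_chunk=2000):
    """Yield the file in chunks small enough to fit the model's context window."""
    with open(path, "r", encoding="utf-8") as f:
        buf = []
        for line in f:
            buf.append(line)
            if len(buf) >= lines_per_chunk:
                yield "".join(buf)
                buf = []
        if buf:
            yield "".join(buf)

results = []
for chunk in chunk_lines("names.txt"):
    # One request per chunk; on Tier 1 this quickly runs into rate limits.
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You process lists of names."},
            {"role": "user", "content": f"<my question>\n\n{chunk}"},
        ],
    )
    results.append(resp.choices[0].message.content)
```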
Question: Is there a faster way to achieve my goal? Can I upload the file and then query the API with a question against the uploaded file ID?