I have a notebook in Colab Enterprise that reads environment variables from a .env file to get database credentials for an ETL job. When I run the notebook manually on a runtime instance, with the .env file uploaded into the working directory, it works fine. However, when I schedule the notebook using Colab Enterprise's schedule feature, it cannot access the environment variables (every variable comes back as None), and I am not sure where or how to pass these variables to the notebook during a scheduled execution (the outputs currently go to a GCP bucket). I have read that I may need to set up the environment using a Python script before the execution, but I am not sure how to go about this.
I access the environment variables with `variable = os.getenv('variable_name')`, reading them from a .env file in the working directory.
I have tried uploading the .env file to the GCP bucket, hoping the scheduled execution's working directory would be inside the bucket so the notebook could read the file, but that did not work. The working directory reported by the scheduled notebook execution is "/mnt/executor/scratch", but I am not sure how to get the .env file into that location, or whether this is the right approach at all.
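What I was hoping to do in the scheduled run is roughly this: download the .env object from the bucket into the scratch directory at the top of the notebook, then load it as usual. This is a sketch, not working code; `my-bucket` and the `.env` object name are placeholders, and it assumes the google-cloud-storage client (which Colab runtimes ship with) and that the execution's service account can read the bucket:

```python
import os

# Working directory reported by the scheduled execution
SCRATCH_DIR = "/mnt/executor/scratch"

def fetch_env_from_bucket(bucket_name, blob_name=".env", dest_dir=SCRATCH_DIR):
    """Download the .env object from a GCS bucket into the scratch
    directory and return its local path."""
    from google.cloud import storage  # imported lazily; available in Colab

    local_path = os.path.join(dest_dir, os.path.basename(blob_name))
    client = storage.Client()
    client.bucket(bucket_name).blob(blob_name).download_to_filename(local_path)
    return local_path

# In the notebook this would be followed by loading the file, e.g.:
# load_dotenv(fetch_env_from_bucket("my-bucket"))
```

I am not sure whether pulling the file from the bucket like this is the intended pattern, or whether there is a proper way to inject environment variables into a scheduled execution directly.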