I am facing the challenge of training a model on large-scale data, specifically about 19 terabytes of video. Building the model itself isn't difficult, but I'm not sure where to store this much data or how to feed it into training. Since we don't have high-performance machines, it seems we will need to rent compute. I'm curious how AI developers who work with data at this scale typically handle the situation.
Additionally, I have found that AWS could be used for this, but I wonder whether that approach is actually common in practice or whether better alternatives exist.
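To make the question concrete, here is roughly the setup I am imagining if streaming from S3 is a reasonable approach. This is just a sketch assuming PyTorch, the webdataset library, and the AWS CLI; the bucket name and shard range are placeholders, and it presumes the videos have first been packed into tar "shards" and uploaded to S3:

```python
import webdataset as wds
from torch.utils.data import DataLoader

# Placeholder bucket/shard names. Each shard is a .tar file containing
# many (video, metadata) samples, streamed from S3 via the AWS CLI so
# that nothing has to be downloaded to local disk up front.
shards = "pipe:aws s3 cp s3://my-video-bucket/train-{000000..000999}.tar -"

# Each sample comes out as raw bytes: the .mp4 payload plus its .json metadata.
dataset = wds.WebDataset(shards).to_tuple("mp4", "json")

# batch_size=None because WebDataset is an iterable dataset; workers each
# stream their own subset of shards in parallel.
loader = DataLoader(dataset, batch_size=None, num_workers=8)

for video_bytes, meta_bytes in loader:
    pass  # decode the video bytes and feed them into the training step
```

The appeal of this pattern is that the training machines never need 19 TB of local storage, only enough network throughput to keep the GPUs fed — but I don't know whether this is what practitioners actually do at this scale.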