Let's say I am working on a project to provide the best possible food recommendation.
I have two columns for training, Food and Sentence, for example:
Food, Sentence
Pizza, I am having a date with an Italian
Chicken wing, I want to have a cheap dinner today
I am currently using a random forest, and I have around 500k rows of data.
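Roughly, my training code looks like this (a simplified sketch, not my exact code; the file name and vectorizer settings are just illustrative):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer

# "food_sentences.csv" is a placeholder name; the columns are Food, Sentence
df = pd.read_csv("food_sentences.csv")

# Turn each sentence into a sparse TF-IDF feature vector
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(df["Sentence"])
y = df["Food"]

# This fit() call is where memory runs out once I go past ~20k rows
clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
clf.fit(X, y)
```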
Every time I train it, my laptop's 16 GB of RAM runs out and training fails with a "could not allocate bytes" error. Is there a way to overcome this? I tried incremental learning, but the result was very poor: the accuracy dropped drastically and I don't even know why.
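My incremental attempt was along these lines (again a simplified sketch: streaming the CSV in chunks through a stateless HashingVectorizer and SGDClassifier.partial_fit):

```python
import pandas as pd
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

# HashingVectorizer is stateless, so each chunk can be
# transformed independently without fitting on all the data first
vectorizer = HashingVectorizer(n_features=2**18, alternate_sign=False)
clf = SGDClassifier()

# partial_fit needs the complete set of labels on the first call
classes = pd.read_csv("food_sentences.csv", usecols=["Food"])["Food"].unique()

# Stream the CSV in 10k-row chunks instead of loading it all at once
for chunk in pd.read_csv("food_sentences.csv", chunksize=10_000):
    X = vectorizer.transform(chunk["Sentence"])
    clf.partial_fit(X, chunk["Food"], classes=classes)
```

With this version the memory usage was fine, but the accuracy was far worse than the random forest trained on 10k rows.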
With 10k rows it works well enough: my laptop runs hot, but the model can be saved with joblib and used successfully. However, it fails to allocate memory as soon as I use more than 20k rows.
Is there another method for this two-column setup that either supports incremental learning properly, or can train on a lot of data at once without taking too much memory?
Thanks