I have a large amount of training data, so much that I am not sure my machine (16 GB of RAM) can hold all of it in memory at once. Given this hardware limitation, what are some ways to train a gradient-boosted tree model or a neural network?
Are there ways to train the model on subsets of the data? I can break my data into subsets if needed: I can pull as much or as little data as I want at a time from my database, so the individual files can be small. I have been using LightGBM for boosting and plan to use Keras for the neural network.
Thanks!