My training data consists of 200,000 examples with 10,000 features each, so the training matrix is 200,000 x 10,000.
I managed to save it in a flat file without running into memory issues by writing each example one at a time, as I generate its features.
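For reference, this is roughly how I write the file: each example goes out as one line, label first, then its 10,000 feature values (the `generate_examples` name is just a placeholder for my own feature pipeline).

```python
def append_example(path, label, feature_vector):
    """Append one example as a line: the label followed by its feature values."""
    with open(path, "a") as f:
        values = " ".join(f"{v:.6g}" for v in feature_vector)
        f.write(f"{label} {values}\n")

# for label, features in generate_examples():  # placeholder for my feature pipeline
#     append_example("train.dat", label, features)
```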
However, when I use Milk, SVMlight, or any other machine learning library, they all try to load the entire training set into memory before training. I only have 8 GB of RAM, so I cannot proceed this way.
Do you know of any way to train the algorithm one example at a time, i.e., so that at any given moment only one example (or a small batch) is held in memory while training?
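To illustrate the access pattern I'm after, here is a rough sketch using scikit-learn's `SGDClassifier.partial_fit` to read the flat file in small chunks and update the model incrementally. I'm not committed to scikit-learn (and the chunk size and binary labels are just assumptions); it only shows what I mean by training without loading everything at once.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

def iter_chunks(path, chunk_size=100):
    """Yield (X, y) chunks parsed from the flat file, chunk_size examples at a time."""
    labels, rows = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            labels.append(int(parts[0]))
            rows.append(np.array(parts[1:], dtype=np.float64))
            if len(rows) == chunk_size:
                yield np.vstack(rows), np.array(labels)
                labels, rows = [], []
    if rows:  # leftover examples at the end of the file
        yield np.vstack(rows), np.array(labels)

clf = SGDClassifier(loss="hinge")   # a linear SVM trained with SGD
classes = np.array([0, 1])          # all class labels, required on the first partial_fit call
for X_chunk, y_chunk in iter_chunks("train.dat"):
    clf.partial_fit(X_chunk, y_chunk, classes=classes)
```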