One of the images I recently came across online that compares machine learning and deep learning is this:
What I understood from it is that deep learning performance is not limited as we increase the amount of training data.
So, is this property tied to the specific structure of a model? For example, suppose I have a deep neural network with 10 layers, each containing 8 hidden units. If I keep increasing the amount of data, will performance continue to improve, or will I have to build a more complex model with more layers and more units?
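To make the question concrete, here is a minimal sketch (assuming scikit-learn; the dataset and sample sizes are made up for illustration) of the fixed architecture I mean: 10 hidden layers of 8 units each, trained on growing slices of the same data so the learning curve can be inspected.

```python
# Sketch: does a fixed-capacity network (10 layers x 8 units) keep improving
# as the training set grows, or does it plateau? Synthetic data for illustration.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=8000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scores = []
for n in [500, 2000, len(X_train)]:  # growing amounts of training data
    model = MLPClassifier(hidden_layer_sizes=(8,) * 10,  # capacity stays fixed
                          max_iter=300, random_state=0)
    model.fit(X_train[:n], y_train[:n])
    scores.append(model.score(X_test, y_test))

print(scores)  # inspect whether test accuracy keeps rising or flattens out
```

If the last few scores barely change while the data size grows, that would suggest the fixed 10×8 architecture has saturated and more capacity is needed.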