Deep learning methods have achieved remarkable performance on many large-scale datasets for machine learning tasks such as visual recognition and natural language processing. Most recent progress in deep learning has relied on supervised learning, in which the whole dataset for a specific task must be prepared before training. In real-world scenarios, however, labeled data for the assigned classes are often gathered incrementally over time, since collecting and annotating training data manually is cumbersome. This suggests sequentially training on a series of datasets with gradually added training samples belonging to new classes, a setting known as incremental learning. In this paper, we propose an effective incremental training method for deep neural networks based on learning automata. The main idea is to train a deep model with dynamic connections that can be either activated or deactivated across the datasets of the incremental training stages. Our method mitigates the destruction of old features while learning new features for the newly added training samples, leading to better performance at each incremental learning stage. Experiments on MNIST and CIFAR-100 demonstrate that our method can be applied to deep neural models over long sequences of incremental training stages and achieves superior performance to training from scratch and fine-tuning.
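The core mechanism described above, layer connections that can be activated or deactivated across incremental training stages, can be illustrated with a minimal sketch. This is a hypothetical toy example, not the authors' implementation: the `GatedLinear` class, its `activate` method, and the activation fractions are all illustrative assumptions, and the learning-automata policy that would decide which gates to open is replaced here by random selection.

```python
import numpy as np

rng = np.random.default_rng(0)

class GatedLinear:
    """Fully connected layer whose individual connections carry a binary
    gate, so each can be activated or deactivated per training stage.
    Hypothetical sketch; not the paper's actual architecture."""

    def __init__(self, n_in, n_out):
        self.w = rng.standard_normal((n_in, n_out)) * 0.1
        self.gate = np.zeros((n_in, n_out), dtype=bool)  # all connections start off

    def activate(self, fraction):
        """Activate a random subset of the currently inactive connections,
        e.g. to add capacity at the start of a new incremental stage.
        (The paper uses learning automata to drive this choice.)"""
        inactive = np.flatnonzero(~self.gate)
        k = min(int(fraction * self.gate.size), inactive.size)
        chosen = rng.choice(inactive, size=k, replace=False)
        self.gate.flat[chosen] = True

    def forward(self, x):
        # Deactivated connections contribute nothing, so features learned
        # through previously activated weights are left undisturbed.
        return x @ (self.w * self.gate)

layer = GatedLinear(4, 3)
layer.activate(0.5)            # stage 1: open half of the connections
y1 = layer.forward(np.ones(4))
stage1_gates = layer.gate.copy()
layer.activate(0.25)           # stage 2: open extra connections for new classes
```

In a full training loop, the weights behind the stage-1 gates would be frozen (or lightly regularized) during stage 2, so new classes are absorbed by the newly activated connections rather than by overwriting old features.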