Artificial neural networks are increasingly being used for a variety of machine learning problems. However, the increased density of interconnections in artificial neural networks leads to high computational and power requirements. One way to reduce the power is to reduce the number of interconnections, which can be achieved using LASSO techniques. In this paper, we propose an alternative smoothing function for LASSO regularization and an incremental pruning algorithm for feedforward artificial neural networks, with the aim of achieving maximally sparse networks with minimal performance degradation. We compare the results obtained using the proposed smoothing function with those of existing smoothing functions. Further, we also evaluate the performance of the proposed incremental pruning algorithm.
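The two ideas the abstract combines can be sketched briefly. Below is a minimal, hypothetical illustration (not the paper's actual algorithm or smoothing function): a gradient step with an L1 (LASSO) subgradient penalty that drives small weights toward zero, followed by incremental magnitude-based pruning that zeroes out a fraction of the smallest remaining connections in each round. All shapes, the penalty strength `lam`, and the pruning fraction are assumed values chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer feedforward weight matrix (hypothetical shape).
W = rng.normal(size=(8, 4))

lam = 0.05  # L1 (LASSO) regularization strength -- assumed value
lr = 0.1    # learning rate -- assumed value

# One gradient step with an L1 penalty: the subgradient of lam*|W|
# is lam*sign(W), which shrinks every weight toward zero.
grad_loss = rng.normal(scale=0.01, size=W.shape)  # stand-in task gradient
W -= lr * (grad_loss + lam * np.sign(W))

def prune_smallest(W, frac=0.2):
    """Zero out roughly the smallest `frac` of remaining nonzero weights."""
    nonzero_mags = np.abs(W[W != 0])
    if nonzero_mags.size == 0:
        return W
    thresh = np.quantile(nonzero_mags, frac)
    W[np.abs(W) < thresh] = 0.0
    return W

# Incremental pruning: repeat small pruning rounds rather than pruning
# everything at once (retraining between rounds is omitted here).
for _ in range(3):
    W = prune_smallest(W, frac=0.2)

sparsity = np.mean(W == 0)
print(f"fraction of pruned connections: {sparsity:.2f}")
```

In practice each pruning round would be interleaved with retraining so the remaining weights can compensate, which is what allows high sparsity with minimal performance degradation.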