In current deep learning models, a centralized architecture forces participants to pool their data in a central Cloud to train a global model, while a distributed architecture requires a parameter server to mediate the training process. However, privacy issues, response delays, and computation and communication bottlenecks prevent these architectures from working well at the scale of IoT devices. To counter these problems, we build a Fog-embedded privacy-preserving deep learning framework called FPPDL, which moves computation from the centralized Cloud to Fog nodes near the end devices. Experimental results on benchmark image datasets under different settings demonstrate that FPPDL achieves accuracy comparable to the centralized stochastic gradient descent (SGD) framework and delivers better accuracy than the standalone SGD framework. Our evaluations also show that FPPDL greatly reduces both computation and communication costs, achieving the desired tradeoff between privacy and performance.
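The contrast between the three training setups can be sketched in a few lines. The following is a hypothetical, simplified illustration (not the paper's actual FPPDL protocol): each Fog node runs plain SGD on its local devices' data, and the nodes then average their model parameters instead of shipping raw data to a central Cloud. All names and the linear-model setup here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(w, X, y, lr=0.05, epochs=30):
    """Plain SGD on one Fog node's local data (squared loss, linear model).
    Hypothetical stand-in for local training; not the paper's method."""
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            grad = 2 * (xi @ w - yi) * xi  # gradient of (xi.w - yi)^2
            w = w - lr * grad
    return w

# Synthetic data: true weights w* = [2, -1]; three Fog nodes hold
# disjoint local shards, mimicking data that never leaves the edge.
w_true = np.array([2.0, -1.0])
shards = []
for _ in range(3):
    X = rng.normal(size=(30, 2))
    y = X @ w_true
    shards.append((X, y))

# One simplified Fog-style round: train locally on each node,
# then average the resulting weights across nodes.
w = np.zeros(2)
local_models = [local_sgd(w.copy(), X, y) for X, y in shards]
w = np.mean(local_models, axis=0)
```

In the centralized baseline, all shards would first be concatenated and trained on together in the Cloud; in the standalone baseline, each node would keep only its own `local_sgd` result. The averaging step is what lets nodes benefit from each other's data without exchanging it.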