An Innovative Reliable Client-Centric Deep Learning Inference Methodology
Keywords:
Client-centric learning, Privacy, Homomorphic Encryption, Blockchain, Distributed Ledger

Abstract
Mobile phones and tablets have access to a vast amount of data that could be used to train learning models, potentially improving the user experience significantly. However, this data is often both extensive and sensitive, making it difficult to collect and train on at a centralized server using conventional methods. In this study, we investigate the use of blockchain technology, a decentralized digital ledger, to build a decentralized client-centric distributed learning system flexible enough to support various machine learning models. The system enables machine learning models to be trained directly on local devices, thereby avoiding the constraints imposed by centralized servers. We present our system design, which includes two decentralized blockchain models built with TensorFlow in Python to ensure the system's reliability and efficiency. Finally, Block-CCL serves as an experimental environment for evaluating how decentralized client-centric (i.e., federated) learning and model-synchronization methods affect the performance of the entire system. The results demonstrate the validity and effectiveness of a federated learning system as a viable alternative to centralized machine learning models.
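The client-centric training and model-synchronization loop described in the abstract can be sketched as federated averaging: each client updates the model on its private data, and only the weights, never the raw data, are aggregated. The sketch below is illustrative only; the function names, the toy one-parameter objective, and the round count are assumptions, not the paper's actual implementation.

```python
# Minimal federated-averaging sketch: clients train locally on private data;
# only model weights are shared and averaged each communication round.
from typing import List

def local_update(weights: List[float], data: List[float], lr: float = 0.1) -> List[float]:
    """One gradient step on a toy mean-squared-error objective:
    fit a single scalar weight to the client's local samples."""
    w = weights[0]
    grad = sum(2 * (w - x) for x in data) / len(data)
    return [w - lr * grad]

def federated_average(client_weights: List[List[float]]) -> List[float]:
    """Server-side synchronization: average each parameter across clients."""
    n = len(client_weights)
    return [sum(ws[i] for ws in client_weights) / n
            for i in range(len(client_weights[0]))]

# Two clients with private local datasets.
global_model = [0.0]
client_data = [[1.0, 2.0], [3.0, 4.0]]

for _ in range(50):  # communication rounds
    updates = [local_update(global_model, d) for d in client_data]
    global_model = federated_average(updates)

print(round(global_model[0], 2))  # converges toward the global data mean, 2.5
```

In a blockchain-backed variant such as the one the paper describes, the `federated_average` step would be replaced by an on-chain aggregation transaction, so that no single central server is trusted with synchronization.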
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.