Concepts, Key Challenges and Open Problems of Federated Learning

Document Type: Original Article

Authors

1 School of Computer Sciences, Universiti Sains Malaysia, Pulau Pinang, Malaysia; Department of Computer Science, University of Gujrat, Gujrat, Pakistan

2 School of Computer Sciences, Universiti Sains Malaysia, Pulau Pinang, Malaysia

Abstract

Modern smart devices such as smartphones and wearables, equipped with high-quality sensors and computationally capable chips, have become primary computing platforms for everyday life. Collectively, these devices hold an enormous amount of valuable data; however, privacy concerns and privacy regulations such as the General Data Protection Regulation (GDPR) prevent this data from being pooled centrally to train more accurate and efficient AI models. Federated Learning (FL) has emerged as a prominent collaborative learning paradigm that learns from such decentralized private data while reasonably satisfying privacy constraints. To learn from such massively distributed data, FL must overcome several unique challenges, including system heterogeneity, statistical heterogeneity, communication constraints, model heterogeneity, privacy, and security. In this article, we first present the fundamentals of federated learning, including its definition and applications. We then examine the unique challenges of FL and critically review recently proposed approaches for addressing them. We also discuss several relatively novel challenges for federated learning and conclude with future research directions in the domain.
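For illustration, the following minimal FedAvg-style sketch (in the spirit of McMahan et al.'s federated averaging) shows the core idea described above: each client trains on its own private data and only model parameters travel to the server, where they are averaged in proportion to local data volume. The linear model, the local_train helper, and all hyperparameters here are illustrative assumptions, not the implementation discussed in this article.

```python
import numpy as np

def local_train(weights, X, y, lr=0.1, epochs=5):
    """One client's local update: a few epochs of gradient descent
    on a plain linear-regression loss, using only this client's data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)   # MSE gradient
        w -= lr * grad
    return w

def fed_avg(client_weights, client_sizes):
    """Server-side aggregation: average the client models,
    weighted by the number of local examples on each client."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Simulated federation with three clients holding unequal amounts of data;
# raw data never leaves a client, only the trained weights are shared.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (40, 60, 100):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):               # communication rounds
    local_updates = [local_train(global_w, X, y) for X, y in clients]
    global_w = fed_avg(local_updates, [len(y) for _, y in clients])
print("aggregated global weights:", global_w)
```

This toy setup also hints at why the challenges listed in the abstract arise: clients differ in data volume and distribution (statistical heterogeneity), and every round costs a full exchange of model parameters (communication overhead).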

Keywords

