
A survey on Distributed learning System

EasyChair Preprint no. 2041

5 pages
Date: November 28, 2019

Abstract

Machine learning has recently been widely used for scientific research and industrial purposes to extract valuable information. A major challenge comes from the communication cost in distributed computing environments. In particular, the iterative nature of many machine learning algorithms, combined with the size of the models and the training data, requires a large amount of communication among machines during the training process. Apart from the actual computational cost, which is shared among multiple machines, distributed computing introduces the additional costs of communication overhead and machine synchronization. Conventional privacy-preserving distributed machine learning approaches focus on simple distributed system architectures, which require heavy computation loads or can only support learning over restricted scenarios. The proposed scheme not only reduces the overhead of the learning process but also provides comprehensive protection for each layer of the hierarchical distributed system.
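The per-iteration communication cost the abstract refers to can be made concrete with a toy simulation. The sketch below is illustrative only and is not the scheme surveyed in the paper: it assumes simple synchronous data-parallel SGD on one-dimensional linear regression, where each simulated worker holds a data shard, computes a local gradient, and ships it to an aggregator every iteration, so the number of gradient messages grows as workers × iterations.

```python
import random

def distributed_sgd(num_workers=4, iterations=200, lr=0.05, seed=0):
    """Toy synchronous data-parallel SGD on 1-D linear regression.

    Each worker holds a private data shard and computes a local
    gradient; the gradients are averaged once per iteration, which is
    the communication step whose cost we count in `messages`.
    """
    rng = random.Random(seed)
    true_w = 3.0  # ground-truth parameter the workers jointly learn

    # One shard of (x, y) pairs per worker.
    shards = []
    for _ in range(num_workers):
        xs = [rng.uniform(-1.0, 1.0) for _ in range(50)]
        shards.append([(x, true_w * x) for x in xs])

    w = 0.0
    messages = 0  # gradient messages sent to the aggregator
    for _ in range(iterations):
        # Local computation: mean-squared-error gradient on each shard.
        grads = [sum(2 * (w * x - y) * x for x, y in shard) / len(shard)
                 for shard in shards]
        messages += num_workers  # every worker ships one gradient
        # Aggregation: average the gradients, then take a step.
        w -= lr * sum(grads) / num_workers
    return w, messages
```

Running `distributed_sgd()` recovers a weight close to 3.0 while exchanging `num_workers * iterations` gradient messages, which is why reducing rounds or message sizes matters so much at scale.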

Keyphrases: distributed, hierarchical, machine learning

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:2041,
  author = {S Mugesh and G Sharmila},
  title = {A survey on Distributed learning System},
  howpublished = {EasyChair Preprint no. 2041},
  year = {EasyChair, 2019}}