
Distributed algorithm to train neural networks using the Map Reduce paradigm

Cristian Mihai Barca and Claudiu Dan Barca
Additional contact information
Cristian Mihai Barca: Electronics, Communications and Computers, University of Pitesti, Romania
Claudiu Dan Barca: The Romanian-American University, Bucharest, Romania

Database Systems Journal, 2017, vol. 8, issue 1, 3-11

Abstract: With the rapid development of powerful computer systems over the past decade, parallel and distributed processing has become a significant resource for fast neural network training, even for real-time processing. Various parallel-computing methods have been proposed in recent years to improve system performance. The two main approaches are to distribute the patterns used for training (training-set-level parallelism) or to distribute the computation performed by the neural network (neural-network-level parallelism). In the present research work we focus on the first approach.
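
The training-set-level parallelism described in the abstract maps naturally onto the Map-Reduce model: each mapper computes a partial gradient on its share of the training patterns, and a reducer sums the partial gradients before a single synchronous weight update. The sketch below illustrates only that split, as a minimal Python/NumPy simulation with a toy linear model; the function names, model, and hyperparameters are illustrative assumptions, not the paper's Hadoop implementation (which would typically be written as Java mappers and reducers).

import numpy as np

def map_partial_gradient(weights, X_part, y_part):
    # Mapper: partial gradient of the squared error on one data partition.
    residual = X_part @ weights - y_part
    return X_part.T @ residual          # sum of per-example gradients

def reduce_gradients(partial_grads):
    # Reducer: combine partial gradients coming from all mappers.
    return np.sum(partial_grads, axis=0)

def train(X, y, n_partitions=4, lr=0.1, epochs=200):
    # Illustrative training loop: split the training set, then iterate
    # map (per-partition gradients) -> reduce (sum) -> weight update.
    weights = np.zeros(X.shape[1])
    X_parts = np.array_split(X, n_partitions)
    y_parts = np.array_split(y, n_partitions)
    for _ in range(epochs):
        partials = [map_partial_gradient(weights, Xp, yp)
                    for Xp, yp in zip(X_parts, y_parts)]   # map phase (parallelisable)
        grad = reduce_gradients(partials) / len(y)          # reduce phase
        weights -= lr * grad                                # single synchronous update
    return weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 3))
    true_w = np.array([1.5, -2.0, 0.5])
    y = X @ true_w + 0.01 * rng.normal(size=1000)
    print(train(X, y))   # should recover weights close to true_w

In an actual Hadoop job the list comprehension in the map phase would run on separate nodes over HDFS splits of the training set, and the reducer would emit the updated weights for the next iteration; the synchronous update per epoch is what keeps the distributed result equivalent to single-machine batch gradient descent.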

Keywords: Artificial Neural Networks; Machine Learning; Map-Reduce Hadoop; Distributed System
Date: 2017

Downloads: http://www.dbjournal.ro/archive/27/27_1.pdf (application/pdf)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:aes:dbjour:v:8:y:2017:i:1:p:3-11


Database Systems Journal is currently edited by Ion Lungu

More articles in Database Systems Journal from the Academy of Economic Studies, Bucharest, Romania. Contact information at EDIRC.
Bibliographic data for series maintained by Adela Bara ().

 
Handle: RePEc:aes:dbjour:v:8:y:2017:i:1:p:3-11