Wednesday 15 July 2015

java - How can I apply multithreading to the backpropagation neural network training?


For my university project, I am building a neural network that can classify whether or not a credit card transaction is fraudulent. I am training it with backpropagation. I am writing this in Java, and I would like to apply multithreading, because my computer is a quad-core i7. It bugs me to spend hours training and see most of my cores sitting idle.

But how would I apply multithreading to backpropagation? Backprop works by propagating the error backwards through the network, and one layer must be finished before moving on to the next. Is there a way to modify my program so that backpropagation uses multiple cores?

First of all, don't use plain backpropagation; there are many other options out there. I would suggest trying RPROP (resilient propagation). It won't be that big of a modification to your backpropagation algorithm: you no longer need to specify a learning rate or momentum. In fact, it is almost as though every connection in the neural network has its own individual, variable learning rate.
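
For illustration, here is a minimal sketch of one common RPROP variant (iRPROP-) in Java. The class name, field names, and constants (ETA_PLUS, ETA_MINUS, the step-size limits) are my own illustrative choices, not something specified in the answer:

import java.util.Arrays;

// Sketch of the iRPROP- update rule: every weight keeps its own step
// size, which grows while the gradient keeps its sign and shrinks when
// the sign flips. Only the gradient's sign is used, so there is no
// global learning rate or momentum to tune.
public class RpropUpdater {
    private static final double ETA_PLUS = 1.2;   // acceleration factor
    private static final double ETA_MINUS = 0.5;  // back-off factor
    private static final double STEP_MAX = 50.0;
    private static final double STEP_MIN = 1e-6;

    private final double[] stepSize;      // per-weight step size
    private final double[] prevGradient;  // gradient from the last epoch

    public RpropUpdater(int weightCount) {
        stepSize = new double[weightCount];
        prevGradient = new double[weightCount];
        Arrays.fill(stepSize, 0.1);       // a common initial step size
    }

    // gradient is the batch gradient dE/dw for the current epoch.
    public void update(double[] weights, double[] gradient) {
        for (int i = 0; i < weights.length; i++) {
            double signChange = prevGradient[i] * gradient[i];
            if (signChange > 0) {
                // Same direction as last epoch: speed up.
                stepSize[i] = Math.min(stepSize[i] * ETA_PLUS, STEP_MAX);
                weights[i] -= Math.signum(gradient[i]) * stepSize[i];
                prevGradient[i] = gradient[i];
            } else if (signChange < 0) {
                // We overshot a minimum: slow down and skip this update.
                stepSize[i] = Math.max(stepSize[i] * ETA_MINUS, STEP_MIN);
                prevGradient[i] = 0;      // take the neutral branch next epoch
            } else {
                // First epoch, or the epoch right after a sign flip.
                weights[i] -= Math.signum(gradient[i]) * stepSize[i];
                prevGradient[i] = gradient[i];
            }
        }
    }
}

The per-weight step size is exactly the "individual, variable learning rate for every connection" described above.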

As to how to apply multithreading to backpropagation: I have written an article on this subject.

Basically, I create a number of threads and divide the training data so that each thread has a near-equal amount. I calculate the gradients in each thread, and they are summed in a reduce step. How the gradients are then applied to the weights depends on the propagation training algorithm used, but the weight update itself is done in a critical section.
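
A minimal sketch of that fork/reduce scheme, assuming a hypothetical Network interface that can accumulate the gradient for one sample (none of these names come from the answer, and the updater is the RPROP sketch above):

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical interface; a real network class would implement this.
interface Network {
    int weightCount();
    double[] weights();
    // Backpropagate one sample and add its gradient into gradOut.
    void accumulateGradient(double[] input, double[] target, double[] gradOut);
}

public class ParallelTrainer {
    private final ExecutorService pool;
    private final int threadCount;

    public ParallelTrainer(int threadCount) {
        this.threadCount = threadCount;
        this.pool = Executors.newFixedThreadPool(threadCount);
    }

    // One training epoch: fork the gradient work, reduce by summing,
    // then apply the update single-threaded (the critical section).
    public void trainEpoch(double[][] inputs, double[][] targets,
                           Network network, RpropUpdater updater)
            throws InterruptedException, ExecutionException {
        int n = inputs.length;
        List<Future<double[]>> futures = new ArrayList<>();

        // Fork: each thread gets a near-equal slice of the samples.
        for (int t = 0; t < threadCount; t++) {
            final int from = t * n / threadCount;
            final int to = (t + 1) * n / threadCount;
            futures.add(pool.submit(() -> {
                double[] grad = new double[network.weightCount()];
                for (int i = from; i < to; i++) {
                    network.accumulateGradient(inputs[i], targets[i], grad);
                }
                return grad;
            }));
        }

        // Reduce: sum the per-thread gradients into one batch gradient.
        double[] total = new double[network.weightCount()];
        for (Future<double[]> f : futures) {
            double[] grad = f.get();
            for (int i = 0; i < total.length; i++) {
                total[i] += grad[i];
            }
        }

        // Critical section: only one thread touches the weights.
        updater.update(network.weights(), total);
    }
}

Because the per-thread gradients are plain sums over disjoint slices of the data, the threads share nothing until the reduce step, so the expensive part runs without any locking.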

When you have considerably more training samples than weights, the code spends far more time in the multithreaded gradient calculation than in the critical-section weight update.

I provide some of the performance results at the above link. It really does speed things up!

