Although it attracts growing interest, the HJ iterative algorithm has never been derived in mathematical terms. In this paper we describe it from a statistical point of view. In particular, contrary to what is sometimes believed, the updating term of the matrix of synaptic efficacies cannot be the gradient of a single C² functional. We show that the HJ algorithm is in fact searching for common zeros of n functionals by means of pipelined stochastic iterations. After a short theoretical analysis, advantages and limitations as well as possible improvements are pointed out on the basis of simulation results.
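The abstract itself gives no formulas, so the following is only an illustrative sketch of the kind of pipelined stochastic iteration at issue, using the HJ adaptation rule as it is commonly stated in the source-separation literature: a recurrent network y = (I + W)^(-1) e whose off-diagonal weights are updated one sample at a time by an outer product of two odd nonlinearities. The mixing matrix, nonlinearities f(y) = y^3 and g(y) = tanh(y), learning rate, and signal choices below are all assumptions for the demo, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic sources (choices are illustrative, not from the paper):
n_samples = 20000
s = np.vstack([np.sin(0.05 * np.arange(n_samples)),   # source 1: sinusoid
               rng.uniform(-1, 1, n_samples)])        # source 2: uniform noise
A = np.array([[1.0, 0.6],
              [0.5, 1.0]])                            # "unknown" mixing matrix
e = A @ s                                             # observed mixtures

W = np.zeros((2, 2))                                  # synaptic efficacies, zero diagonal
mu = 1e-4                                             # learning rate (assumed)
for t in range(n_samples):
    y = np.linalg.solve(np.eye(2) + W, e[:, t])       # network output y = (I + W)^-1 e
    dW = mu * np.outer(y**3, np.tanh(y))              # stochastic update f(y_i) g(y_j)
    np.fill_diagonal(dW, 0.0)                         # no self-connections
    W += dW                                           # one sample per iteration (pipelined)

y_all = np.linalg.solve(np.eye(2) + W, e)             # outputs after adaptation
corr = np.corrcoef(y_all)                             # residual output cross-correlation
print(abs(corr[0, 1]))
```

Note that the expected update E[f(y_i) g(y_j)] vanishes for independent zero-mean symmetric outputs, so the rule seeks common zeros of several functionals rather than the minimum of one cost, which is the point made in the abstract.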