Effects of activation functions, initializer functions, etc., on neural networks for face detection

Posted by harry on Stack Overflow
Published on 2010-04-21T19:40:24Z

There are various activation functions: sigmoid, tanh, etc. And there are also several initializer functions: Nguyen-Widrow, random, normalized, constant, zero, etc. So do these have much effect on the outcome of a neural network specialising in face detection? Right now I'm using the tanh activation function and just randomising all the weights from -0.5 to 0.5. I have no idea if this is the best approach though, and with 4 hours to train the network each time, I'd rather ask on here than experiment!
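For what it's worth, the two initializations mentioned above (plain uniform in [-0.5, 0.5] and Nguyen-Widrow) differ only in a rescaling step. A minimal sketch of both, using NumPy; the layer sizes and function names here are illustrative, not from the original post:

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_init(n_in, n_out, low=-0.5, high=0.5):
    """Plain uniform initialization in [low, high] -- the approach
    described in the question."""
    return rng.uniform(low, high, size=(n_out, n_in))

def nguyen_widrow_init(n_in, n_out):
    """Nguyen-Widrow initialization: draw uniform weights, then rescale
    each unit's weight vector to norm beta = 0.7 * n_out**(1/n_in),
    which spreads the units' active regions across the input space."""
    w = rng.uniform(-0.5, 0.5, size=(n_out, n_in))
    beta = 0.7 * n_out ** (1.0 / n_in)
    norms = np.linalg.norm(w, axis=1, keepdims=True)
    return beta * w / norms

# e.g. 20x20-pixel face patches (400 inputs) feeding 30 hidden tanh units
w = nguyen_widrow_init(n_in=400, n_out=30)
```

With a tanh hidden layer, the Nguyen-Widrow rescaling tends to speed up early training because each hidden unit starts in the non-saturated part of tanh; the uniform version can leave some units nearly saturated from the start.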

