Scaling input data to a neural network


First, there are many types of ANNs; I will assume you are asking about the simplest one: a multilayer perceptron trained with backpropagation.

Second, your question mixes up two separate things: data scaling (normalization) and weight initialization.

You need to initialize weights randomly to break symmetry during learning: if all weights start out the same, their updates will also be the same, so the hidden units can never learn different features. The concrete values generally don't matter, but values that are too large can slow convergence.
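A minimal sketch of the symmetry problem, using NumPy and a tiny one-layer network (the layer sizes and the 0.01 scale are illustrative choices, not prescribed values):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 3

# Symmetric init: every hidden unit gets identical weights.
W_sym = np.full((n_in, n_hidden), 0.5)

# Random init breaks the symmetry; a small scale (e.g. 0.01) keeps
# tanh/sigmoid units out of their flat saturated regions.
W_rand = rng.normal(0.0, 0.01, size=(n_in, n_hidden))

x = rng.normal(size=n_in)

h_sym = np.tanh(x @ W_sym)    # all hidden activations are identical
h_rand = np.tanh(x @ W_rand)  # activations differ between units

print(np.allclose(h_sym, h_sym[0]))   # True: units are interchangeable
print(np.allclose(h_rand, h_rand[0]))  # False: symmetry is broken
```

Since identical activations lead to identical gradients, the symmetric units would stay identical after every update, no matter how long you train.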

You are not required to normalize your data, but normalization can make the learning process faster. See this question for more details.
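One common form of normalization is standardization: rescale each input feature to zero mean and unit variance. A sketch with made-up data whose features differ in scale by several orders of magnitude:

```python
import numpy as np

# Hypothetical inputs with wildly different feature scales.
X = np.array([[1000.0, 0.1],
              [2000.0, 0.2],
              [3000.0, 0.3]])

# Standardize each feature (column) to zero mean, unit variance.
mu = X.mean(axis=0)
sigma = X.std(axis=0)
X_scaled = (X - mu) / sigma

print(X_scaled.mean(axis=0))  # ~[0, 0]
print(X_scaled.std(axis=0))   # [1, 1]
```

Note that `mu` and `sigma` are computed on the training set only and then reused to transform validation and test data, so all splits see the same transformation.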



Author: James

Updated on June 04, 2022

Comments

  • James
    James, about 2 years ago

    Do we have to scale input data for a neural network? How does it affect the final solution of the neural network?

    I've tried to find some reliable sources on that. The book "The Elements of Statistical Learning" (page 400) says it helps in choosing reasonable initial random weights to start with.

    Aren't the final weights deterministic regardless of the initial random weights we use?

    Thank you.