Monday, February 25, 2019

Impact of scaling and shifting random variables

To make training the network easier, we standardize each of the continuous variables: we shift and scale them so that they have zero mean and a standard deviation of 1.
The shift and scale factors are saved so we can invert the transformation when we use the network for predictions.
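As a minimal sketch of this standardization (the feature values here are made up for illustration):

```python
import numpy as np

# A hypothetical continuous feature
data = np.array([12.0, 15.0, 20.0, 22.0, 31.0])

# Save the shift (mean) and scale (std) so the transform can be inverted
mean, std = data.mean(), data.std()

scaled = (data - mean) / std      # zero mean, unit standard deviation
restored = scaled * std + mean    # going backwards for predictions

print(np.allclose(restored, data))  # the original values are recovered
```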

If we construct a new random variable by adding a constant to another random variable:
  • The mean shifts by that constant
  • The standard deviation is unchanged
If we construct a new random variable by multiplying another random variable by a constant:
  • The mean is multiplied by that constant
  • The standard deviation is multiplied by the absolute value of that constant
