Showing posts with label Data Science. Show all posts

Wednesday, January 29, 2020

Error functions




  • In most learning networks, the error is calculated as the difference between the actual output and the predicted output.
  • The error function tells us how far we are from the solution.
  • The function used to compute this error is known as the loss function.
  • Different loss functions give different errors for the same prediction, and so have a considerable effect on the performance of the model.
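The last point can be checked directly: a minimal sketch comparing two common loss functions, mean squared error and mean absolute error, on the same predictions (the numbers are illustrative).

```python
def mse(actual, predicted):
    """Mean squared error: average of squared differences."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def mae(actual, predicted):
    """Mean absolute error: average of absolute differences."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

actual = [3.0, 5.0, 2.0]
predicted = [2.5, 8.0, 2.0]   # one large miss (5.0 vs 8.0)

print(mse(actual, predicted))  # ~3.083 — the large miss dominates
print(mae(actual, predicted))  # ~1.167
```

The same prediction yields very different error values, and MSE penalizes the single large mistake much more heavily than MAE — which is exactly why the choice of loss function affects model behavior.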

Thursday, January 23, 2020

Transfer learning




TRANSFER LEARNING: Using a pre-trained network on images that were not in its training set is known as transfer learning.
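In practice this is done with a network pre-trained on a large image dataset (e.g., via torchvision or Keras). As a toy NumPy sketch of the same idea — everything here is an illustrative stand-in, not a real pre-trained model — we freeze a "pre-trained" feature extractor and train only a new head on the new task:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained feature extractor; in real transfer learning
# this would be, e.g., a CNN trained on ImageNet. Its weights stay frozen.
W_pretrained = rng.normal(size=(4, 8))

def extract_features(x):
    return np.maximum(0, x @ W_pretrained)   # frozen ReLU projection

# New task: train only a small logistic head on top of the frozen features.
X = rng.normal(size=(64, 4))
y = (X[:, 0] > 0).astype(float)              # toy binary labels

w_head = np.zeros(8)
for _ in range(200):                          # plain gradient descent
    feats = extract_features(X)
    p = 1 / (1 + np.exp(-(feats @ w_head)))   # sigmoid head
    grad = feats.T @ (p - y) / len(y)         # only the head is updated
    w_head -= 0.5 * grad

preds = 1 / (1 + np.exp(-(extract_features(X) @ w_head))) > 0.5
acc = np.mean(preds == (y == 1))
print(acc)
```

Only `w_head` is updated; `W_pretrained` never changes — that is the essence of reusing a pre-trained network for a new task.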

Monday, October 28, 2019

Monday, February 25, 2019

Impact of scaling and shifting random variables


To make training the network easier, we standardize each of the continuous variables. That is, we'll shift and scale the variables such that they have zero mean and a standard deviation of 1.
The scaling factors are saved so we can go backwards when we use the network for predictions.
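The standardize-then-invert step can be sketched as follows (the values are illustrative):

```python
import numpy as np

values = np.array([10.0, 12.0, 14.0, 18.0, 26.0])

# Save the scaling factors so we can go backwards for predictions later.
mean, std = values.mean(), values.std()

standardized = (values - mean) / std   # zero mean, standard deviation of 1
restored = standardized * std + mean   # invert using the saved factors

print(standardized.mean())  # ~0.0
print(standardized.std())   # 1.0
```

Keeping `mean` and `std` around is what lets the network's outputs be mapped back to the original units.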

SHIFTING
If one random variable is constructed by adding a constant to another random variable:
  • The mean is shifted by that constant
  • The standard deviation is unchanged
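Both the shifting rule above and the corresponding scaling rule (multiplying by a constant multiplies the mean and the standard deviation by that constant) can be checked numerically; the distribution parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)

shifted = x + 3.0
print(shifted.mean() - x.mean())   # ~3.0: the mean shifts by the constant
print(shifted.std() - x.std())     # ~0.0: the standard deviation does not

scaled = 2.0 * x
print(scaled.mean() / x.mean())    # ~2.0: scaling multiplies the mean...
print(scaled.std() / x.std())      # ~2.0: ...and the standard deviation
```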

Categorical Variables


  • These are variables whose values fall into a fixed set of categories
  • Categorical variables have no inherent order
  • They are not quantitative variables
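Because categorical variables have no order, they are typically one-hot encoded rather than mapped to integers (which would impose a false ordering). A minimal sketch, with illustrative category names:

```python
def one_hot(value, categories):
    """Encode a categorical value as a binary vector (no implied order)."""
    return [1 if value == c else 0 for c in categories]

colors = ["red", "green", "blue"]   # illustrative categories
print(one_hot("green", colors))     # [0, 1, 0]
```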

Thursday, January 11, 2018

Quick review of machine learning algorithms

These are some of the important machine learning algorithms:

Decision tree

  • Belongs to the family of supervised learning algorithms.
  • Can be used for solving both regression and classification problems.
  • The general aim of a decision tree is to create a training model that can predict the class or value of the target variable by learning decision rules inferred from prior data (training data).
       Ex: a banker deciding whether to grant a loan.
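The loan example can be sketched as a single learned decision rule (a one-split "stump"). In practice you would fit a full tree, e.g., with scikit-learn's DecisionTreeClassifier; the feature, threshold, and data below are illustrative assumptions:

```python
def best_threshold(incomes, labels):
    """Pick the income threshold that best separates grant/deny labels."""
    best_t, best_correct = None, -1
    for t in incomes:
        correct = sum((inc >= t) == lab for inc, lab in zip(incomes, labels))
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

incomes = [20, 35, 50, 65, 80]              # applicant incomes (k$)
granted = [False, False, True, True, True]  # past decisions (training data)

t = best_threshold(incomes, granted)
print(t)         # 50: the decision rule inferred from prior data
print(75 >= t)   # True — a new applicant at 75 would be granted a loan
```

A real decision tree repeats this kind of split recursively, over many features, until the leaves are (nearly) pure.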
