Commit 3ab0175f authored by Chris Coughlin's avatar Chris Coughlin

Added gradient machine to list of available algorithms

parent cfc24222
@@ -95,6 +95,7 @@ Myriad uses [Smile]( and [Apache Mahout](http:/
1. **Stochastic Gradient Descent (SGD)** - also known as steepest descent. A stochastic approximation to gradient descent, in which the direction of the local gradient is found and steps are taken in the negative of this direction.
2. **Adaptive Stochastic Gradient Descent (ASGD)** - an “ensemble” learning method in which multiple SGD models are trained and the “best” are used for predictions.
3. **Passive Aggressive (PA)** - for a weight vector W initialized with 0 in each element, calculate the loss at each step L = max(0, 1 - yd<sup>T</sup>W), where y is the actual category of the sample d and d<sup>T</sup> is the transpose of d. Update the weight vector as W<sub>new</sub> = W + yLd and repeat.
4. **Gradient Machine** - a gradient machine learner with one hidden sigmoid layer that attempts to minimize hinge loss. Currently in development and should be considered experimental.
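To make the SGD description above concrete, here is a minimal sketch of a single steepest-descent step: the local gradient is computed and a step is taken in its negative direction. This is an illustration only, not the Myriad API; the function name `sgd_step` and the learning rate are assumptions.

```python
def sgd_step(w, grad, lr=0.1):
    # steepest descent: step in the negative direction of the local gradient
    return w - lr * grad

# Example: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w = 0.0
for _ in range(100):
    w = sgd_step(w, 2 * (w - 3))
# w converges toward the minimizer at 3
```

In a stochastic setting the gradient would be estimated from a randomly chosen sample (or mini-batch) at each step rather than computed exactly.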
Each algorithm has its strengths and weaknesses; Emphysic recommends evaluating each during an initial experiment. Note that for SGD in particular, Myriad Trainer may report that no useful results were returned; this indicates that the model was unable to learn any difference between the positive and negative samples. If you are using a train and a test set, additional training attempts may resolve the issue.
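The Passive Aggressive update rule described in item 3 can be sketched as follows. This is a minimal illustration of the rule exactly as stated (L = max(0, 1 - yd<sup>T</sup>W), then W<sub>new</sub> = W + yLd), not Myriad's implementation; the function name `pa_train` and the epoch count are assumptions.

```python
import numpy as np

def pa_train(samples, labels, epochs=3):
    """Train a Passive Aggressive binary classifier.

    samples: (n, m) array of feature vectors d
    labels:  (n,) array of categories y in {-1, +1}
    """
    w = np.zeros(samples.shape[1])  # weight vector W initialized with 0
    for _ in range(epochs):
        for d, y in zip(samples, labels):
            # loss L = max(0, 1 - y * d^T W)
            loss = max(0.0, 1.0 - y * d.dot(w))
            # update W_new = W + y * L * d
            w = w + y * loss * d
    return w
```

On a linearly separable sample set the loss drops to zero once every sample is classified with margin at least 1, at which point the weights stop changing ("passive"); a misclassified or low-margin sample forces an update ("aggressive").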