neuralnetworks.core

error

(error instance input expected-output)

Calculates the error between the expected output nodes and the predicted output nodes

new-instance

(new-instance input thetas output problem-type)
(new-instance input thetas output problem-type options)

Creates a new neural network instance.

Problem-type accepts either :classification or :regression and determines the default sigmoid and error functions.

For :classification it will use

{:sigmoid-fn standard-logistic
 :error-fn cross-entropy}

for the options. Cross-entropy is more suitable for classification because it heavily penalizes confident misclassifications.

Otherwise, for :regression it will use

{:sigmoid-fn hyperbolic-tangent
 :error-fn mean-squared-error}

Mean squared error is best suited for regression (curve-fitting) problems.
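
For reference, here are minimal plain-Clojure sketches of the two error functions over a single prediction vector. These are illustrative only; the library's actual implementations may differ.

;; Illustrative sketches only; not this library's implementations.
;; Cross-entropy grows sharply as a prediction moves away from its label,
;; which is why it penalizes misclassification so heavily.
(defn cross-entropy [predicted expected]
  (- (reduce + (map (fn [p y]
                      (+ (* y (Math/log p))
                         (* (- 1 y) (Math/log (- 1 p)))))
                    predicted expected))))

;; Mean squared error averages the squared distances, suiting curve fitting.
(defn mean-squared-error [predicted expected]
  (/ (reduce + (map (fn [p y]
                      (let [d (- p y)] (* d d)))
                    predicted expected))
     (count predicted)))

(cross-entropy [0.9 0.1] [1 0])      ;; ~0.21 (mostly correct)
(cross-entropy [0.1 0.9] [1 0])      ;; ~4.61 (confidently wrong)
(mean-squared-error [0.9 0.1] [1 0]) ;; ~0.01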

Options is a hash map of

{:regularization-rate value
 :activation-fn function
 :sigmoid-fn function       ; optional, if you want to customize/override sigmoid function
 :error-fn function         ; optional, if you want to customize/override error function
 :optimizer optimizer}

Thetas is a vector of initial weight matrices, one between each pair of adjacent layers. For a single hidden layer, thetas is a vector of two weight matrices.
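
For example, a network with 2 input nodes, one hidden layer of 3 nodes, and 1 output node takes a vector of two weight matrices, which randomize-thetas (documented below) can generate. A sketch, assuming the namespace is required with the alias nn:

(require '[neuralnetworks.core :as nn])

;; One hidden layer => two weight matrices (input->hidden, hidden->output)
(def thetas (nn/randomize-thetas 2 [3] 1))
(count thetas) ;; => 2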

If optimizer is not specified, it will default to the gradient descent optimizer with the following settings:

  • initial learning rate of 4
  • learning rate update of 0.5

Note: always normalize the input and output nodes for better performance.
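
One common approach is min-max scaling into the [0 1] range; a plain-Clojure sketch (not part of this library):

;; Min-max normalization sketch (not part of this library):
;; rescales a sequence of node values into [0 1].
(defn normalize [xs]
  (let [lo (apply min xs)
        hi (apply max xs)]
    (mapv #(/ (- % lo) (- hi lo)) xs)))

(normalize [10 20 30]) ;; => [0 1/2 1]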

Returns a hash map of the form

{
  :input input-matrix
  :output output-matrix
  :regularization-rate value ; default is 0
  :sigmoid-fn function       ; default is standard logistic for classification, hyperbolic
                             ; tangent for regression
  :error-fn function         ; default is cross-entropy for classification, mean squared error
                             ; for regression
  :optimizer function        ; default is gradient-descent with the default options
  :states {
            :thetas [theta-matrix-1, theta-matrix-2, ...]
            :iteration (atom 0)
            :error (atom nil)
            :training-durations 1 ; in ms
          }
}
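
A hedged end-to-end sketch (learning XOR). The row-per-sample, vector-of-vectors matrix representation is an assumption; adapt it to whatever matrix format the library expects.

(require '[neuralnetworks.core :as nn])

;; One row per sample; the vector-of-vectors representation is an assumption.
(def input  [[0 0] [0 1] [1 0] [1 1]])
(def output [[0] [1] [1] [0]])

(def instance
  (nn/new-instance input
                   (nn/randomize-thetas 2 [3] 1)
                   output
                   :classification))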

predict

(predict instance input)

Predicts the output for the given input using the neural network instance
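
A usage sketch, continuing from the XOR instance created above:

;; Predicts the output nodes for new (normalized) input rows.
(nn/predict instance [[1 0]])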

randomize-thetas

(randomize-thetas input-nodes hidden-layers-nodes output-nodes)

Creates randomized thetas for initial values

It uses the following formula

randomize(L0, L1) * 2 * epsilon - epsilon

Where L0 and L1 are the numbers of nodes in the two layers adjacent to that theta (e.g. input nodes and hidden-layer-1 nodes, or hidden-layer-1 nodes and output nodes)

Epsilon is calculated using the following formula

sqrt(6) / sqrt(L0 + L1)

hidden-layers-nodes is a vector of integers (the number of nodes per hidden layer)
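
A plain-Clojure sketch of this initialization for a single theta (illustrative; not the library's internals, and the L0 x L1 matrix orientation is an assumption):

;; Sketch of the formula above; matrix orientation is an assumption.
(defn rand-theta [l0 l1]
  (let [epsilon (/ (Math/sqrt 6) (Math/sqrt (+ l0 l1)))]
    (vec (for [_ (range l0)]
           (vec (for [_ (range l1)]
                  ;; randomize(L0, L1) * 2 * epsilon - epsilon
                  (- (* (rand) 2 epsilon) epsilon)))))))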

train!

(train! instance stopping-conditions)

Trains the neural network, updating the thetas/weights

Stopping-conditions is a vector of stopping-condition functions used by the optimizer, which in turn is used by the neural network training function.

If multiple stopping conditions are provided, they are treated as a logical OR: as soon as any one of them is satisfied, training stops (i.e. the optimizer is finished)
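
A hedged sketch, continuing from the XOR instance above. The value handed to each stopping-condition function is an assumption (here, a map with :iteration and :error keys mirroring the instance's :states):

;; Hedged sketch; the stopping-condition signature is an assumption.
(defn max-iterations [n]
  (fn [state] (>= (:iteration state) n)))

(defn error-below [threshold]
  (fn [state]
    (when-let [err (:error state)]
      (< err threshold))))

;; Training stops as soon as EITHER condition is satisfied (OR semantics).
(nn/train! instance [(max-iterations 5000) (error-below 0.01)])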