Next price predictor using Neural Network

Author: gpwr

Version History:

06/26/2009 - added a new indicator BPNN Predictor with Smoothing.mq4, in which prices are smoothed using EMA before predictions.

08/20/2009 - corrected the code that calculates the neuron activation function to prevent an arithmetic exception; updated BPNN.cpp and BPNN.dll

08/21/2009 - added memory cleanup at the end of the DLL execution; updated BPNN.cpp and BPNN.dll

Brief theory of Neural Networks:

A neural network is an adjustable model of outputs as functions of inputs. It consists of several layers:

  • an input layer, which consists of the input data
  • one or more hidden layers, which consist of processing nodes called neurons
  • an output layer, which consists of one or several neurons whose outputs are the network outputs
All nodes of adjacent layers are interconnected. These connections are called synapses. Every synapse has an assigned scaling coefficient by which the data propagated through the synapse is multiplied. These scaling coefficients are called weights (w[j][k]). In a Feed-Forward Neural Network (FFNN) the data is propagated from the inputs to the outputs. Here is an example of an FFNN with one input layer, one output layer, and two hidden layers:

[Figure: an FFNN with one input layer, two hidden layers, and one output layer]


The topology of a FFNN is often abbreviated as follows: <# of inputs> - <# of neurons in the first hidden layer> - <# of neurons in the second hidden layer> -...- <# of outputs>. The above network can be referred to as a 4-3-3-1 network.
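The number of synapses implied by such a topology is just the sum of products of adjacent layer sizes; for the 4-3-3-1 network that is 4*3 + 3*3 + 3*1 = 24 weights. A minimal sketch of that count (countWeights is a hypothetical helper for illustration, not part of the enclosed library):

#include <cstdio>
#include <vector>

// Sums the products of adjacent layer sizes,
// e.g. {4, 3, 3, 1} -> 4*3 + 3*3 + 3*1 = 24 synapses.
int countWeights(const std::vector<int>& layers)
{
    int n = 0;
    for (size_t i = 0; i + 1 < layers.size(); ++i)
        n += layers[i] * layers[i + 1];
    return n;
}

int main()
{
    std::printf("%d\n", countWeights({4, 3, 3, 1})); // prints 24
}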

The data is processed by each neuron in two steps, shown within the neuron circle by a summation sign and a step sign, respectively:

  1. All inputs are multiplied by the associated weights and summed.
  2. The resulting sum is processed by the neuron's activation function, whose output is the neuron output.
It is the neuron's activation function that gives nonlinearity to the neural network model. Without it, there is no reason to have hidden layers, and the neural network becomes a linear autoregressive (AR) model.
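As a minimal illustration of this two-step forward pass through one layer (a sketch only, not the enclosed BPNN.cpp code; the function and variable names here are hypothetical):

#include <cmath>
#include <vector>

// Sigmoid activation (#0 in the enclosed library).
double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Propagates an input vector through one layer of neurons.
// w[j][k] is the weight of the synapse carrying input k into neuron j.
std::vector<double> forwardLayer(const std::vector<std::vector<double>>& w,
                                 const std::vector<double>& in)
{
    std::vector<double> out(w.size());
    for (size_t j = 0; j < w.size(); ++j) {
        double sum = 0.0;                      // step 1: weighted sum of inputs
        for (size_t k = 0; k < in.size(); ++k)
            sum += w[j][k] * in[k];
        out[j] = sigmoid(sum);                 // step 2: activation function
    }
    return out;
}

For the 4-3-3-1 network above, forwardLayer would be applied three times in a row, with weight matrices of sizes 3x4, 3x3, and 1x3.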

The enclosed library files for the NN functions allow a choice between three activation functions:

  • sigmoid: sigm(x) = 1/(1+exp(-x)) (#0)
  • hyperbolic tangent: tanh(x) = (1-exp(-2x))/(1+exp(-2x)) (#1)
  • rational function: x/(1+|x|) (#2)
[Plot: the three activation functions sigm(x), tanh(x), and x/(1+|x|)]
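A sketch of how such a selection might look (activate and its integer codes mirror the list above, but the actual signature in BPNN.cpp may differ):

#include <cmath>

// Returns the activation of x for the selected function code.
// Note: for large |x| the exp() calls can overflow; the 08/20/2009 fix to
// BPNN.cpp addressed an arithmetic exception in this computation, so a real
// implementation should guard the argument range.
double activate(double x, int fn)
{
    switch (fn) {
    case 0: return 1.0 / (1.0 + std::exp(-x));                              // sigmoid
    case 1: return (1.0 - std::exp(-2.0 * x)) / (1.0 + std::exp(-2.0 * x)); // hyperbolic tangent
    case 2: return x / (1.0 + std::fabs(x));                                // rational function
    default: return x;                                                      // unknown code: identity
    }
}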