DADiSP/NeuralNet: Neural Network Module
DADiSP/NeuralNet
is an add-on module for DADiSP that provides direct, easy access to the
demonstrated predictive power and pattern recognition capabilities of
neural network technology. With DADiSP/NeuralNet, users can build their
own artificial neural networks (ANNs) and apply them to produce more
accurate predictions and pattern classifications.
Key Features
- Menu-driven Network Design
- Automatic Normalisation of Data
- Choice of the Number of Hidden Layers
- Unlimited Input and Output Variables
- Unlimited Number of Runs
- Simultaneous Cross-validation During Training to Verify Output Results
- Built-in Protection Against Local Minima Distorting Output Results
- User Selectable Desired Mean Square Error, Minimum Gradient Norm and
Desired Absolute Error
- Post-Training Error Graph Types: Digital Error, Analogue Error, Maximum
Error and Gradient Values
- Extract Random Seeds Returns the Values Used to Initialise a Network,
Enabling Another Network to Be Built with the Same Weights
- Extract Network Weights Returns the Weights and Biases that Define
the Network
Neural Network Training
Neural networks resemble the human brain in that they can learn. A
back-propagation neural network develops its predictive capability by
training on a set of historical inputs and their known outputs. The
network begins by assigning random weights to each designated input
variable, then adjusts those weights according to how closely its actual
outputs match the desired outputs in the training set. Once the weights
have settled at values that minimise the difference between the network's
expected and actual outputs, the network can be applied to new data for
prediction and classification.
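As a rough illustration of this training loop (written in Python with
NumPy rather than DADiSP's own functions, and using an invented toy
dataset), the sketch below initialises a small network with random
weights, compares its outputs against known targets, and back-propagates
the error to adjust the weights:

```python
# A toy back-propagation loop, not DADiSP code: one hidden layer is
# trained on historical inputs X with known outputs y.
import numpy as np

rng = np.random.default_rng(0)

# Invented training set: two input variables, one known output (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Random initial weights, as described above.
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(20000):
    # Forward pass: compute the network's actual outputs.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Error between actual and desired outputs.
    err = out - y

    # Backward pass: propagate the error and adjust the weights.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 3))  # trained outputs approach the 0/1 targets
```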
Back-propagation Learning Algorithm
DADiSP/NeuralNet employs the back-propagation learning algorithm, which
has become the most widely used neural network paradigm for modelling,
forecasting, and classification. To minimise the error in the network,
DADiSP/NeuralNet uses a rapid-descent algorithm derived from the Vogl
method of locating the global minimum. Because results depend on the
initial conditions, the module lets you train many neural networks on
the same data with different initial configurations and pick the best
one.
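The multiple-restart idea can be sketched as follows (again in
illustrative Python, not DADiSP code, reusing the toy network above):
each seed produces different initial weights, and the network with the
lowest final error is kept:

```python
# Restart strategy sketch: train several networks from different random
# initial weights and keep the one with the lowest final error.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(seed, epochs=5000, lr=0.5):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        err = out - y
        d_out = err * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
    mse = float(np.mean(err ** 2))
    return mse, (W1, b1, W2, b2)

# Different seeds give different initial conditions and, often,
# different local minima; keep the best performer.
results = [train(seed) for seed in range(10)]
best_mse, best_weights = min(results, key=lambda r: r[0])
print(f"best MSE over 10 restarts: {best_mse:.4f}")
```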
Powerful Preprocessing Functions
Data preprocessing is one of the biggest challenges in using neural
network tools. Because DADiSP/NeuralNet is fully integrated with DADiSP,
hundreds of analysis functions are available to pre- and post-process
neural network data. DADiSP provides mathematical and statistical
functions to scale, filter, and process the data so that the features to
be learned by the neural network stand out. A typical Worksheet contains
the preprocessing steps, the neural network and the output results:
change the input data or initial conditions and each dependent Window is
automatically recalculated, so you immediately see the effects of your
changes on the neural network.
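A hypothetical preprocessing step of this kind might look like the
following Python sketch (DADiSP users would do the equivalent with
Worksheet functions): a noisy signal is smoothed and reduced to
per-window features that serve as the network's input variables:

```python
# Illustrative preprocessing: filter a noisy signal, then extract
# simple per-window features to feed into a neural network.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)
raw = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=t.size)

# Filter: 5-point moving average to suppress measurement noise.
kernel = np.ones(5) / 5
smoothed = np.convolve(raw, kernel, mode="same")

# Feature extraction: per-window mean and RMS become network inputs.
windows = smoothed.reshape(20, 10)
features = np.column_stack([windows.mean(axis=1),
                            np.sqrt((windows ** 2).mean(axis=1))])
print(features.shape)  # (20, 2): 20 samples, 2 input variables
```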
NeuralNet Functions
DADiSP/NeuralNet includes several functions to create, apply and analyse
neural networks.
| Function | Description |
| --- | --- |
| applynet | Apply a neural network to data |
| getrundata | Extract a particular set of data for a run |
| getseeds | Extract neural network random seed values |
| getweights | Extract neural network weights |
| makenet | Create a neural network |
| normalise | Normalise target data to the ±1 range |
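For instance, the ±1 normalisation performed by normalise corresponds to
a simple linear min/max mapping. The Python sketch below shows that
mapping as a hypothetical equivalent; the actual syntax and behaviour
are defined by the module:

```python
# Hypothetical Python equivalent of a +-1 normalisation: linearly map
# the data so its minimum becomes -1 and its maximum becomes +1.
import numpy as np

def normalise_pm1(x):
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(), x.max()
    return 2 * (x - lo) / (hi - lo) - 1

target = np.array([3.0, 7.5, 12.0, 4.2])
print(normalise_pm1(target))  # values now span [-1, 1]
```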