DADiSP / NeuralNet

Neural Network Module

DADiSP/NeuralNet is an add-on module to DADiSP that provides direct and easy access to the demonstrated predictive power and pattern recognition capability of neural networking technology. With DADiSP/NeuralNet users can build their own artificial neural networks (ANNs) and apply them to achieve more accurate predictions and pattern classifications.

Key features

  • Menu-driven network design.
  • Automatic normalisation of data.
  • Choice of the number of hidden layers.
  • Unlimited input and output variables.
  • Unlimited number of runs.
  • Cross-validation training to verify output results simultaneously.
  • Built-in protection against local minima distorting output results.
  • User selectable desired mean square error, minimum gradient norm and desired absolute error.
  • Post-training error graph types: Digital Error, Analogue Error, Maximum Error and Gradient Values.
  • Extract Random Seeds gives the values used to start a network and enables building another network with the same weights.
  • Extract network weights returns the weights and biases that define the network.

Neural network training

Neural networks resemble the human brain in that they can learn. A back-propagation neural network develops its predictive capabilities by being trained on a set of historical inputs and known resulting outputs. The neural net initially applies random weights to each designated input variable, then adjusts the weights according to how closely the actual output values match the desired output values in the training set of historical data. Once weights have been found that minimise the difference between expected and actual outputs, the neural net can be applied to new data for classification.
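
As a concrete illustration of that training loop, here is a minimal stand-alone sketch in Python/NumPy. This is not DADiSP/NeuralNet syntax; the data set, layer size and variable names are purely illustrative. Random starting weights are repeatedly adjusted so the network's outputs move towards the desired outputs in a small training set.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy training set: inputs and known target outputs (XOR pattern).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # One hidden layer of 4 units; weights start as small random values.
    W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    lr = 0.5
    for epoch in range(20000):
        # Forward pass: inputs -> hidden layer -> output.
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)

        # Back-propagate the output error to obtain weight gradients.
        err = Y - T                          # difference from desired outputs
        dY = err * Y * (1 - Y)               # sigmoid derivative at the output
        dH = (dY @ W2.T) * H * (1 - H)       # error pushed back to the hidden layer

        # Adjust each weight in proportion to its contribution to the error.
        W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
        W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

    # After training the outputs should lie close to the targets 0, 1, 1, 0.
    print("trained outputs:", Y.ravel().round(3))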

Back-propagation learning algorithm

DADiSP/NeuralNet employs the back-propagation learning algorithm, which has become the most widely used neural network paradigm for modelling, forecasting and classification. To minimise the error in the network, DADiSP/NeuralNet uses a rapid-descent algorithm derived from the Vogl method of locating the global minimum. Because the results depend on the initial conditions, the module lets you train many neural networks on the same data with different initial configurations and pick the best one.
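
That multi-start strategy can be sketched generically in Python, here using scikit-learn's MLPRegressor as a stand-in back-propagation trainer rather than the module's own functions; the toy data set and the number of seeds are assumptions for illustration. Several networks are trained from different random initialisations and the one with the lowest validation error is kept.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, size=(200, 3))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2        # toy target function
    X_train, X_val = X[:150], X[150:]
    y_train, y_val = y[:150], y[150:]

    best_net, best_mse = None, np.inf
    for seed in range(10):                           # 10 different initial configurations
        net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                           random_state=seed).fit(X_train, y_train)
        mse = np.mean((net.predict(X_val) - y_val) ** 2)
        if mse < best_mse:
            best_net, best_mse = net, mse            # keep the best network so far

    print("best validation MSE:", best_mse)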

Powerful preprocessing functions

Preprocessing the data is one of the biggest challenges in using neural network tools. DADiSP/NeuralNet is fully integrated with DADiSP, so hundreds of analysis functions are available to pre- and post-process neural network data. DADiSP provides mathematical and statistical functions to scale, filter and process the data to identify the features to be learned by the neural network. A typical Worksheet contains the preprocessing steps, the neural network and the output results. Simply change the input data or initial conditions and each dependent Window is automatically recalculated, so you immediately see the effect of your changes on the neural network.
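
A typical preprocessing step is scaling a signal into the -1..+1 range, and keeping the scale factors so that new data and network outputs can be mapped back to engineering units afterwards. The sketch below is plain Python/NumPy with illustrative function names; it is not the module's normalise function.

    import numpy as np

    def normalise_pm1(x):
        """Scale a 1-D signal linearly into [-1, +1]; return the scaled signal
        plus the (offset, span) needed to undo the scaling.
        Assumes the signal is not constant (span would be zero)."""
        lo, hi = x.min(), x.max()
        offset, span = (hi + lo) / 2.0, (hi - lo) / 2.0
        return (x - offset) / span, (offset, span)

    def denormalise_pm1(x_scaled, offset, span):
        """Map a scaled signal back to its original units."""
        return x_scaled * span + offset

    signal = np.array([3.2, 7.9, 5.5, 9.1, 4.4])
    scaled, (offset, span) = normalise_pm1(signal)
    print(scaled)                                   # values now lie in [-1, +1]
    print(denormalise_pm1(scaled, offset, span))    # original values recovered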

NeuralNet functions

DADiSP/NeuralNet includes several functions to create, apply and analyse neural networks.

applynet      Apply a neural network to data.
getrundata    Extract a particular set of data for a run.
getseeds      Extract neural network random seed values.
getweights    Extract neural network weights.
makenet       Create a neural network.
normalise     Normalise target data to the ±1 range.
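
The exact argument lists of these functions are not reproduced here, but the idea behind applynet and getweights can be sketched generically in Python: once a network's weight matrices and bias vectors have been extracted, applying the network to new data is just a forward pass through them. The function name, layer sizes and numbers below are all illustrative, not the module's own.

    import numpy as np

    def apply_net(weights, biases, x):
        """Forward pass: inputs -> hidden layers -> output, sigmoid at each layer."""
        a = np.asarray(x, dtype=float)
        for W, b in zip(weights, biases):
            a = 1.0 / (1.0 + np.exp(-(a @ W + b)))
        return a

    # Example: one hidden layer of 3 units mapping 2 inputs to 1 output.
    weights = [np.array([[0.4, -1.2, 0.7], [1.1, 0.3, -0.5]]),
               np.array([[0.9], [-0.8], [1.5]])]
    biases  = [np.array([0.1, -0.2, 0.0]), np.array([0.05])]
    print(apply_net(weights, biases, [[0.2, -0.7], [1.0, 0.4]]))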
