DADiSP / NeuralNet
Neural Network Module
DADiSP NeuralNet is an add-on module to DADiSP that provides direct and easy access to the demonstrated predictive power and pattern recognition capability of neural networking technology. With DADiSP/NeuralNet users can build their own artificial neural networks (ANNs) and apply them to achieve more accurate predictions and pattern classifications.
- Menu-driven Network design.
- Automatic normalisation of data.
- Choice of the number of hidden layers.
- Unlimited input and output variables.
- Unlimited number of runs.
- Cross-validation during training to verify output results as the network learns.
- Built-in protection against local minima distorting output results.
- User-selectable desired mean square error, minimum gradient norm and desired absolute error.
- Post-training error graph types: Digital Error, Analogue Error, Maximum Error and Gradient Values.
- Extract Random Seeds gives the values used to start a network and enables building another network with the same weights.
- Extract network weights returns the weights and biases that define the network.
Neural network training
Neural networks resemble the human brain in that they can learn. A back-propagation neural network develops its predictive capabilities by being trained on a set of historical inputs and their known outputs. The network first applies random weights to each designated input variable, then adjusts those weights according to how closely its actual outputs match the desired outputs in the training set. Once weights have been found that minimise the difference between expected and actual output, the network can be applied to new data for classification.
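The training loop described above (start from random weights, then repeatedly adjust them to reduce the output error) can be sketched in plain Python. This is a generic back-propagation illustration, not DADiSP code; the 2-2-1 network size, learning rate, epoch count and AND-gate training set are all arbitrary choices for the sketch:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(data, epochs=5000, lr=0.5, seed=1):
    """Train a tiny 2-2-1 back-propagation network by stochastic gradient descent."""
    rng = random.Random(seed)
    # Random starting weights, as described above: two hidden units
    # (2 input weights + bias each) and one output unit (2 weights + bias).
    wh = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    wo = [rng.uniform(-1, 1) for _ in range(3)]

    def forward(x):
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in wh]
        y = sigmoid(wo[0] * h[0] + wo[1] * h[1] + wo[2])
        return h, y

    for _ in range(epochs):
        for x, t in data:
            h, y = forward(x)
            # Output-layer delta first, then hidden-layer deltas:
            # the error is propagated "back" through the network.
            do = (y - t) * y * (1 - y)
            dh = [do * wo[i] * h[i] * (1 - h[i]) for i in range(2)]
            # Adjust each weight against the gradient of the squared error.
            for i in range(2):
                wo[i] -= lr * do * h[i]
                wh[i][0] -= lr * dh[i] * x[0]
                wh[i][1] -= lr * dh[i] * x[1]
                wh[i][2] -= lr * dh[i]
            wo[2] -= lr * do

    return lambda x: forward(x)[1]

# Learn a simple AND gate from four historical input/output pairs.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
predict = train(data)
```

After training, `predict([1, 1])` is close to 1 and the other three inputs give outputs close to 0, mirroring the weight-adjustment process the module automates.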
Back-propagation learning algorithm
DADiSP/NeuralNet employs the back-propagation learning algorithm. Back-propagation has become the most widely used neural network paradigm for modelling, forecasting, and classification. To minimise the error in the network, DADiSP/NeuralNet uses a rapid-descent algorithm derived from the Vogl method of locating the global minimum. Since the results depend on the initial conditions, the module lets you train many neural networks on the same data with different initial configurations and pick the best one.
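The dependence on initial conditions is easy to see on a toy one-dimensional error surface: gradient descent started on the wrong side settles into a local minimum, while repeating the run from several starting points and keeping the best result finds the global one. This is a generic sketch of the multiple-restart idea, not the Vogl rapid-descent algorithm itself; the function and starting points are invented for illustration:

```python
def error(w):
    # Toy non-convex "error surface": a local minimum near w = +0.97
    # and the global minimum near w = -1.02.
    return w**4 - 2 * w**2 + 0.2 * w

def gradient(w):
    return 4 * w**3 - 4 * w + 0.2

def descend(w, lr=0.01, steps=2000):
    """Plain gradient descent from a single starting point."""
    for _ in range(steps):
        w -= lr * gradient(w)
    return w

# "Train" from several initial configurations and keep the best run.
starts = [-1.5, -0.5, 0.5, 1.5]
runs = [descend(w0) for w0 in starts]
best_w = min(runs, key=error)
```

A single run from `w = 1.5` gets trapped near the local minimum at `w ≈ +0.97`; taking the best of the four restarts lands near the global minimum at `w ≈ -1.02`, which is what picking the best of many trained networks achieves.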
Powerful preprocessing functions
Preprocessing the data is one of the biggest challenges in using neural network tools. DADiSP/NeuralNet is fully integrated with DADiSP, so hundreds of analysis functions are available to pre- and post-process neural network data. DADiSP provides mathematical and statistical functions to scale, filter and process the data to identify features to be learned by the neural network. A typical Worksheet contains the preprocessing steps, the neural network and the output results. Simply change the input data or initial conditions and each dependent Window is automatically recalculated, so you immediately see the effect of your changes on the neural network.
DADiSP/NeuralNet includes several functions to create, apply and analyse neural networks.
|Function|Description|
|---|---|
|applynet|Apply a neural network to data.|
|getrundata|Extract a particular set of data for a run.|
|getseeds|Extract neural network random seed values.|
|getweights|Extract neural network weights.|
|makenet|Create a neural network.|
|normalise|Normalise target data to the ±1 range.|
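As a rough illustration of what normalising target data to the ±1 range means, a linear min-max mapping sends the smallest value to -1 and the largest to +1. The exact mapping DADiSP's `normalise` uses is an assumption here; the function below is generic Python, not the module's implementation:

```python
def normalise(series):
    """Scale a series linearly into [-1, 1]: min maps to -1, max maps to +1.
    (Assumed min-max mapping, for illustration only.)"""
    lo, hi = min(series), max(series)
    if hi == lo:
        # A constant series carries no range information; map it to zero.
        return [0.0 for _ in series]
    return [2.0 * (v - lo) / (hi - lo) - 1.0 for v in series]
```

For example, `normalise([0, 5, 10])` gives `[-1.0, 0.0, 1.0]`.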