
Article: Neural Networks on the NetBeans Platform

 
# 1  
Old 03-02-2011
Article: Neural Networks on the NetBeans Platform

Neuroph Studio is an open source, Java neural network development environment built on top of the NetBeans Platform. This article shows how to create Java neural networks for classification.

More...
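
For readers who want to try the underlying library outside the Studio GUI, here is a minimal, hypothetical sketch using the Neuroph core API (assuming Neuroph 2.7+ on the classpath; class names differ slightly between releases). It trains a small multilayer perceptron on a toy XOR-style classification data set; it illustrates the idea, it is not code taken from the article.

import org.neuroph.core.data.DataSet;
import org.neuroph.core.data.DataSetRow;
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.util.TransferFunctionType;

public class SimpleClassifier {
    public static void main(String[] args) {
        // Toy binary classification data set: two inputs, one output (XOR).
        DataSet trainingSet = new DataSet(2, 1);
        trainingSet.addRow(new DataSetRow(new double[]{0, 0}, new double[]{0}));
        trainingSet.addRow(new DataSetRow(new double[]{0, 1}, new double[]{1}));
        trainingSet.addRow(new DataSetRow(new double[]{1, 0}, new double[]{1}));
        trainingSet.addRow(new DataSetRow(new double[]{1, 1}, new double[]{0}));

        // Multilayer perceptron: 2 inputs, 3 hidden neurons, 1 output,
        // sigmoid activation in all layers.
        MultiLayerPerceptron network =
                new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 2, 3, 1);

        // Train with the network's default learning rule (backpropagation).
        network.learn(trainingSet);

        // Classify one input pattern.
        network.setInput(1, 0);
        network.calculate();
        System.out.println("output: " + network.getOutput()[0]);
    }
}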


2 More Discussions You Might Find Interesting

1. Fedora

Installing netbeans

How can I install NetBeans on Solaris? Is there a package manager, like yum on Fedora? I have tried pkg install netbeans, but it doesn't find it. I already performed pkg install slim_install and it installed the JRE. However, if I run the sh file downloaded for NetBeans, it says no compatible JDK was found. Now... (2 Replies)
Discussion started by: Fingerz
2 Replies

2. HP-UX

netbeans IDE on Hp-UX

Can the NetBeans IDE be installed and used on HP-UX? Sorry, I know this is more Java-specific, but does anybody have any experience with this? There does not seem to be any specific installer or support from NetBeans. (0 Replies)
Discussion started by: domestos
0 Replies
FANN_GET_ACTIVATION_STEEPNESS(3)

NAME
fann_get_activation_steepness - Returns the activation steepness for the supplied neuron and layer number

SYNOPSIS
float fann_get_activation_steepness (resource $ann, int $layer, int $neuron)

DESCRIPTION
Get the activation steepness for neuron number neuron in layer number layer, counting the input layer as layer 0. It is not possible to get the activation steepness for neurons in the input layer.

The steepness of an activation function describes how quickly the function moves from its minimum to its maximum value. A higher steepness also gives more aggressive training. When training neural networks where the output values should be at the extremes (usually 0 and 1, depending on the activation function), a steep activation function can be used (e.g. 1.0). The default activation steepness is 0.5.
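
As a language-agnostic illustration of what the steepness parameter does (this is not part of the manual page), the short Java sketch below evaluates a sigmoid-style activation at two steepness settings, assuming the common FANN-style definition f(x) = 1 / (1 + exp(-2*s*x)); the exact formula depends on which activation function the layer uses. A higher steepness pushes outputs toward the extremes faster around x = 0.

public class SteepnessDemo {
    // Sigmoid-style activation with adjustable steepness s
    // (assumed form: f(x) = 1 / (1 + exp(-2*s*x))).
    static double sigmoid(double x, double s) {
        return 1.0 / (1.0 + Math.exp(-2.0 * s * x));
    }

    public static void main(String[] args) {
        double[] steepnessValues = {0.5, 1.0};  // default vs. a steeper setting
        for (double s : steepnessValues) {
            System.out.printf("steepness %.1f:%n", s);
            for (double x = -3.0; x <= 3.0; x += 1.5) {
                System.out.printf("  f(%+.1f) = %.3f%n", x, sigmoid(x, s));
            }
        }
    }
}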
PARAMETERS
o $ann - Neural network resource.
o $layer - Layer number.
o $neuron - Neuron number.

RETURN VALUES
The activation steepness for the neuron, or -1 if the neuron is not defined in the neural network, or FALSE on error.

SEE ALSO
fann_set_activation_function(3), fann_set_activation_steepness_layer(3), fann_set_activation_steepness_hidden(3), fann_set_activation_steepness_output(3), fann_set_activation_steepness(3).