Added some text to this effect. Basically, the internals and interface were designed to allow a completely different way of working with the network, one that isn't very useful for people relying on the many standard training methods, but that is useful for exploring random mutation and fitness functions.
AI::ANN is an artificial neural network simulator. It differs from existing solutions in that it fully exposes the internal variables and allows (and forces) the user to fully customize the topology and specifics of the produced neural network. If you want a simple solution, you do not want this module. This module was specifically written to be used for a simulation of evolution in neural networks, not training. The traditional 'backprop' and similar training methods are not (currently) implemented. Rather, we make it easy for a user to specify the precise layout of their network (including both topology and weights, as well as many parameters), and to then retrieve those details. The purpose of this is to allow an additional module to then tweak these values by a means that models evolution by natural selection. The canonical way to do this is the included AI::ANN::Evolver, which allows the addition of random mutations to individual networks, and the crossing of two networks. You will also, depending on your application, need a fitness function of some sort, in order to determine which networks to allow to propagate. Here is an example of that system.
use AI::ANN;
my $network = AI::ANN->new( input_count => $inputcount, data => \@neuron_definition );
my $outputs = $network->execute( \@inputs ); # Basic network use

use AI::ANN::Evolver;
my $handofgod = AI::ANN::Evolver->new(); # See that module for calling details
my $network2 = $handofgod->mutate($network); # Random mutations
# Test an entire 'generation' of networks, and let $network and $network2 be
# among those with the highest fitness function in the generation.
my $network3 = $handofgod->crossover($network, $network2);
# Perhaps mutate() each network either before or after the crossover to
# introduce variety.
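AI::ANN does not supply a fitness function; what counts as "fit" is entirely up to the application. The following is a rough sketch of how one generation might be scored and bred. The fitness() routine, the @generation and @test_cases arrays, and the keep-the-top-half selection scheme are invented for this illustration and are not part of AI::ANN or AI::ANN::Evolver; it also assumes execute() returns an arrayref of output values, as the synopsis above suggests.

use strict;
use warnings;
use AI::ANN;
use AI::ANN::Evolver;

# Assumed to be populated elsewhere by the application:
# @generation holds AI::ANN objects; @test_cases holds
# { inputs => [...], target => [...] } hashrefs.
my (@generation, @test_cases);

my $handofgod = AI::ANN::Evolver->new();

# Hypothetical fitness function: a network scores higher the closer its
# outputs are to the target values across all test cases.
sub fitness {
    my ($network) = @_;
    my $score = 0;
    for my $case (@test_cases) {
        my $outputs = $network->execute( $case->{inputs} );
        $score -= abs( $outputs->[$_] - $case->{target}[$_] )
            for 0 .. $#{ $case->{target} };
    }
    return $score;
}

# Score and rank the current generation, keeping the fitter half as parents.
my @ranked = map  { $_->[1] }
             sort { $b->[0] <=> $a->[0] }
             map  { [ fitness($_), $_ ] } @generation;
my $half    = int(@ranked / 2) || 1;
my @parents = @ranked[ 0 .. $half - 1 ];

# Breed the next generation from the survivors, mutating for variety.
my @next_generation;
while (@next_generation < @generation) {
    my $mother = $parents[ rand @parents ];
    my $father = $parents[ rand @parents ];
    push @next_generation,
        $handofgod->mutate( $handofgod->crossover($mother, $father) );
}

The particular selection pressure shown here (keep the fittest half, pair survivors at random) is only one of many possibilities; the module's job ends at exposing the networks and providing mutate() and crossover(), and everything beyond that is up to the surrounding application.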
We elected to do this with a new module rather than by extending an existing module because of the extensive differences in the internal structure and the interface that were necessary to accomplish these goals.