Name : perl-Statistics-LTU
Version : 2.8
Vendor : obs://build.opensuse.org/devel:languages:perl
Release : lp154.6.1
Date : 2023-01-27 17:46:34
Group : Development/Libraries/Perl
Source RPM : perl-Statistics-LTU-2.8-lp154.6.1.src.rpm
Size : 0.06 MB
Packager : https://www.suse.com/
Summary : An implementation of Linear Threshold Units
Description :
Statistics::LTU defines methods for creating, destroying, training and testing Linear Threshold Units. A linear threshold unit is a 1-layer neural network, also called a perceptron. LTUs are used to learn classifications from examples.
An LTU learns to distinguish between two classes based on the data given to it. After training on a number of examples, the LTU can then be used to classify new (unseen) examples. Technically, LTUs learn to distinguish two classes by fitting a hyperplane between examples; if the examples have n features, the hyperplane will have n dimensions. In general, the LTU's weights will converge to define the separating hyperplane.
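To illustrate the create/train/test cycle described above, here is a minimal Perl sketch. The constructor arguments, the train/test method names, and the use of +1/-1 as class labels are assumptions made for illustration; consult the module's own documentation (POD and _ltu.doc_) for the exact interface.

    use strict;
    use warnings;
    use Statistics::LTU;

    # Hypothetical usage: an LTU over 3 features, trained with the
    # Absolute Correction Rule; the second constructor argument
    # (input scaling on/off) is an assumption.
    my $ltu = Statistics::LTU::ACR->new(3, 1);

    # Train on labelled examples; +1/-1 as class labels is an assumption.
    $ltu->train([ 1, 3, 2],  1);
    $ltu->train([-1, 3, 0], -1);

    # Classify a new (unseen) example.
    print "Class of [1, 5, -2]: ", $ltu->test([1, 5, -2]), "\n";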
The LTU.pm file defines an uninstantiable base class, LTU, and four instantiable classes built on top of it. The four classes differ in the training rules they use:
* ACR - Absolute Correction Rule
* TACR - Thermal Absolute Correction Rule (thermal annealing)
* LMS - Least Mean Squares rule
* RLS - Recursive Least Squares rule
Each of these training rules behaves somewhat differently. Exact details of how these work are beyond the scope of this document; see the additional documentation file (_ltu.doc_) for discussion.
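Because the four classes are built on the common LTU base class, switching training rules amounts to instantiating a different subclass. The sketch below assumes all four share the constructor shown earlier; any rule-specific parameters (e.g. learning rate or annealing temperature) are omitted and would need to be checked against the module's documentation.

    use Statistics::LTU;

    # Assumed: the four subclasses share the base-class interface;
    # only the weight-update rule applied by train() differs.
    my $acr  = Statistics::LTU::ACR->new(3, 1);   # Absolute Correction Rule
    my $tacr = Statistics::LTU::TACR->new(3, 1);  # Thermal Absolute Correction Rule
    my $lms  = Statistics::LTU::LMS->new(3, 1);   # Least Mean Squares
    my $rls  = Statistics::LTU::RLS->new(3, 1);   # Recursive Least Squares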
RPM found in directory: /packages/linux-pbone/ftp5.gwdg.de/pub/opensuse/repositories/devel:/languages:/perl:/CPAN-S/15.4/noarch