This program demonstrates the use of Learning Entropy [1] for novelty detection in time series where relatively simple real-time learning systems can instantly detect novelty in otherwise complex dynamical behavior [2-4].
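For intuition, the core idea of AISLE can be sketched in a few lines of Python. This is a simplified illustration based on the description in [1], not the demo's actual code: it counts how many of the current weight increments are unusually large compared with their recent average magnitudes, over a set of sensitivity thresholds "alphas".

```python
import numpy as np

def aisle(dw_history, alphas):
    # dw_history: (M, nw) array of the last M weight-increment vectors,
    # with the newest increments in the last row.
    # alphas: detection sensitivity thresholds.
    dw_abs = np.abs(dw_history)
    mean_abs = dw_abs[:-1].mean(axis=0)   # recent average |dw_i| per weight
    current = dw_abs[-1]                  # newest weight increments
    # count weights whose increment exceeds alpha times its recent average
    hits = sum((current > a * mean_abs).sum() for a in alphas)
    # normalize by the number of alphas and the number of weights
    return hits / (len(alphas) * dw_history.shape[1])
```

A value near 0 means the learning system is adapting as usual; a value near 1 means many weights are making unusually large increments, i.e., novelty in the data.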
------------------ 
This program and its Python code are free for non-commercial use, with no warranty.
------------------
Program hints:
 To begin, click the button "RUN setup scripts + LE", which imports a demo sine time series and the setups and runs the AISLE (Approximate Individual Sample Learning Entropy) algorithm [1].
 Then you can modify the Python scripts shown on the right to create or import your own time series; apply your time-series changes with the button "<== run script for your time series y". You can also customize the setups on the right.
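As a hypothetical example of a user-supplied series, the following sketch builds a sine wave with a small perturbation injected for AISLE to detect (the variable name `y` follows the demo's convention; the length, frequency, and perturbation values are illustrative):

```python
import numpy as np

# Illustrative example series: a sine wave with a small perturbation
# injected at sample 600.
N = 1000
k = np.arange(N)
y = np.sin(2 * np.pi * k / 50.0)   # base sine time series
y[600:605] += 0.2                  # small amplitude perturbation to detect
```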
------------------
Learning Entropy tuning hints:
LE is a cognitive method, and as such its performance depends on the quality of the learning system and on its pre-training for the given data.
 If AISLE does not perform as you expect, improving the pre-training is usually the first thing to try. Increase the number of pre-training epochs (10x, 100x, 1000x) or extend the pre-training interval Ntrain.
You can also reconfigure the predictor input by increasing the input vector length n (or by increasing the prediction horizon p).
Then you can try nonlinear predictors instead of linear ones, e.g., a Quadratic Neural Unit (QNU) or a single-hidden-layer MLP.
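For illustration, a QNU prediction can be sketched as a quadratic form of the augmented input vector (a simplified illustration, not the demo's code; in practice only the upper triangle of W is adapted, but the full quadratic form is mathematically equivalent):

```python
import numpy as np

def qnu_predict(x, W):
    # Quadratic Neural Unit: y = colx^T W colx, where colx = [1, x]
    # augments the input with a bias term, so the output contains the
    # constant, all linear terms, and all pairwise products of inputs.
    colx = np.concatenate(([1.0], x))
    return colx @ W @ colx
```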
 False detections can be suppressed by including higher orders of Learning Entropy (OLEs); so far, orders up to OLE = 4 have been investigated.
 For more or less sensitive detection, you can customize the vector "alphas" by including smaller or larger values.
 If the prediction is unstable, try normalized Gradient Descent; you can also decrease the learning rate mu.
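A normalized gradient-descent step for a linear predictor can be sketched as follows (an illustrative NLMS-style sketch, not the demo's code; the scaling by 1/(1 + ||x||^2) is what keeps the update bounded for large-magnitude inputs):

```python
import numpy as np

def ngd_step(w, x, target, mu=0.1):
    # One normalized gradient-descent step for a linear predictor y = w @ x.
    e = target - w @ x                    # prediction error
    dw = (mu / (1.0 + x @ x)) * e * x     # normalized update
    return w + dw, e
```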
 This program does not include the ranged setup for weight history that was used for quasi-periodic signals in [1], Eq. (27), p. 4179; in this program, a similar effect can be achieved by modifying the prediction horizon p.
------------------
Keep combining setups and playing with the parameters of the method for various data with various perturbations. 
We hope this program helps you understand Learning Entropy and its potential.
Thank you for your interest in Learning Entropy and for trying this application.
------------------
Contact: http://aspicc.fs.cvut.cz/
------------------ 
Created using Python 2.6, NumPy, Matplotlib, wxGlade 0.6.5, and EasyGui.
------------------ 
References:
[1] Bukovsky, I. Learning Entropy: Multiscale Measure for Incremental Learning. Entropy 2013, 15, 4159-4187.
(Full-text PDF: http://www.mdpi.com/1099-4300/15/10/4159)
-
[2] Bukovsky, I., Kinsner, W., Bila, J.: "Multiscale Analysis Approach for Novelty Detection in Adaptation Plot", 3rd Sensor Signal Processing for Defense 2012 (SSPD 2012), Imperial College London, UK, September 24-27, 2012, doi: 10.1049/ic.2012.0114, E-ISBN: 978-1-84919-712-0.
-
[3] Bukovsky, I., Bila, J.: "Adaptive Evaluation of Complex Dynamic Systems using Low-Dimensional Neural Architectures", in Advances in Cognitive Informatics and Cognitive Computing, Series: Studies in Computational Intelligence, Vol. 323/2010, eds. D. Zhang, Y. Wang, W. Kinsner, Springer-Verlag Berlin Heidelberg, 2010, ISBN: 978-3-642-16082-0, pp. 33-57.
-
[4] Bukovsky, I.: Modeling of Complex Dynamic Systems by Nonconventional Artificial Neural Architectures and Adaptive Approach to Evaluation of Chaotic Time Series, Ph.D. thesis, Faculty of Mechanical Engineering, Czech Technical University in Prague, 2007.
------------------
Source: readme.txt, updated 2014-07-23