Hybrid Model Learner (HML)
A special architecture for model learning.
This new discovery* combines linear and nonlinear operations with efficient statistical methods to realize noise-tolerant,
unsupervised pattern discovery and learning by modeling regularities in sequences.
The learned pattern models are related to context-free grammars.
* The discovery is based on work performed during 2002-2009 by Professor Unto K. Laine. A patent is pending (FI-20095708).
[Figure: HML is hybrid in many dimensions]
[Figure: HML has many different functionalities integrated]
[Figure: HML general architecture]
Noise or Pattern?
Based on its internal statistical models and methods, HML can separate noise from perceptually
relevant, structural information, relying solely on the information provided by the input sequence under study.
The only prior knowledge about how to separate patterns from noise, very general in nature,
is embodied in the internal statistical models and methods of the HML.
D = input sequence
M = model for D
G = generator/extractor
D' = residual sequence
1° The statistical properties of the input sequence D are analyzed in order to create models M(D) for it.
2° The models are used to remove the discovered structures from the input sequence (extractor G).
3° A compressed residual sequence D' is created.
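As a rough sketch of steps 1°-3°, assuming nothing about HML's internal models (which are not published), the toy below uses a bigram-frequency model as a stand-in for M and substring substitution as the extractor G:

```python
from collections import Counter

def build_model(D, n=2):
    """M(D): frequency model of the length-n substrings of the input sequence."""
    return Counter(D[i:i + n] for i in range(len(D) - n + 1))

def extract(D, M):
    """G: remove the most regular structure found by M, producing residual D'.
    Here: replace the most frequent bigram with a single new symbol."""
    pattern, _ = M.most_common(1)[0]
    token = "#"  # placeholder symbol assumed absent from the input alphabet
    return D.replace(pattern, token), pattern

D = "abababab" + "xy"       # highly regular prefix plus an irregular tail
M = build_model(D)
D_res, pattern = extract(D, M)
print(pattern, D_res)       # the discovered pattern and the shorter residual
```

The residual D' is shorter than D exactly to the degree that D contained the modeled regularity; the irregular tail survives untouched.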
HML experiments and demonstrations
Example 1: Compression of binary sequences
- HML performs about 30% better than GZIP
Example 2: Classification of pseudorandom sequences
- Sequences generated by standard pseudorandom generators can be recognized and separated from natural random sequences.
- Weak, hidden patterns in the pseudorandom sequences can be detected and learned by the HML.
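HML's actual detection mechanism is not described here. As a toy stand-in, the sketch below exposes one well-known hidden pattern of a textbook linear congruential generator (with odd multiplier and odd increment its lowest bit strictly alternates), which a simple adjacent-bit statistic separates from cryptographically random bits:

```python
import secrets

def lcg_bits(seed, n, a=1103515245, c=12345, m=2**31):
    """Low-order bits of a textbook LCG (glibc-style constants)."""
    x, bits = seed, []
    for _ in range(n):
        x = (a * x + c) % m
        bits.append(x & 1)
    return bits

def flip_rate(bits):
    """Fraction of adjacent positions where the bit changes:
    ~0.5 for a structureless sequence; 1.0 reveals a deterministic pattern."""
    return sum(b1 != b2 for b1, b2 in zip(bits, bits[1:])) / (len(bits) - 1)

n = 10_000
pseudo = lcg_bits(seed=42, n=n)
natural = [secrets.randbits(1) for _ in range(n)]
print(flip_rate(pseudo), flip_rate(natural))  # 1.0 vs roughly 0.5
```

A statistic this crude only catches this one generator flaw; the point is merely that "weak, hidden patterns" in standard generators are real and detectable from the sequence alone.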
Example 3: A nonlinear, high-resolution HML filter
- Δf·Δt ≈ 0.1 (classical limit in the linear case: 0.5).
- The obtained resolution is close to that measured from the human auditory system.
Example 4: Compression of the first 10 000 digits of pi
- HML expands the sequence less than any other known lossless compression method.
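The claim makes sense because the digits of pi behave like an incompressible random sequence, and no lossless compressor can shrink such data; every method only adds its own overhead. A quick zlib check on random bytes (used here as a stand-in for an incompressible bit sequence) illustrates the expansion:

```python
import os
import zlib

raw = os.urandom(1250)               # 10 000 random bits, packed into bytes
packed = zlib.compress(raw, level=9)
print(len(raw), len(packed))         # the output is slightly longer than the input
```

The few extra bytes are DEFLATE's framing overhead; on incompressible input, the smallest such overhead wins.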
Example 5: Patterns in genomic sequences
- HML is able to find important subsequences without any external help, relying entirely on its internal mechanisms.
Example 6: Word recognition in continuous speech
- HML is able to learn word models simply by being exposed to sentences containing the words (supervised learning).
- After the learning phase, HML is ready to recognize similar words in continuous speech.
Publications on the theory and simulations of HML are under preparation (6/09)
[Figure: Compression of binary sequences where the probability of 1 is q and that of 0 is (1-q); HML tested with two different preprocessing methods]
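The GZIP side of this experiment can be approximated with the standard library (HML itself is not publicly available): generate Bernoulli(q) bit sequences, pack them into bytes, and compare the DEFLATE output size against the Shannon lower bound of N·H(q) bits:

```python
import math
import random
import zlib

def bernoulli_bytes(q, n_bits, seed=0):
    """Pack n_bits Bernoulli(q) bits (P(1) = q) into a bytes object."""
    rng = random.Random(seed)
    bits = [1 if rng.random() < q else 0 for _ in range(n_bits)]
    out = bytearray()
    for i in range(0, n_bits, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

def entropy_bits(q, n_bits):
    """Shannon lower bound for the sequence: n_bits * H(q) bits."""
    h = -q * math.log2(q) - (1 - q) * math.log2(1 - q)
    return n_bits * h

n = 80_000
for q in (0.05, 0.2, 0.5):
    data = bernoulli_bytes(q, n)
    gz = zlib.compress(data, level=9)
    print(q, len(gz) * 8, round(entropy_bits(q, n)))  # achieved vs lower bound, in bits
```

The gap between the DEFLATE output and the N·H(q) bound is the headroom a better-matched statistical model, such as the one reported for HML, can exploit.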
A Chinese proverb:
Men who say it cannot be done should not interrupt those doing it.