• Jens Bürger, Universidad Privada Boliviana
• Jorge Calvimontes



Keywords:

Dictionary Learning, SAILnet, Time-Series, Decomposition


Dictionary Learning (DL) is a feature learning method that derives a finite collection of dictionary elements (atoms) from a given dataset. These atoms are small characteristic features representing recurring patterns within the data. A dictionary is therefore a compact representation of complex or large-scale datasets. In this paper we investigate DL for temporal signal decomposition and reconstruction. Decomposition is a common method in time-series forecasting that separates a complex composite signal into different frequency components so as to reduce forecasting complexity. Because they represent characteristic features, we consider dictionary elements to function as filters for the decomposition of temporal signals. Rather than simple filters with clearly defined frequency spectra, we hypothesize that dictionaries and the corresponding reconstructions act as more complex filters. Training different dictionaries then permits decomposing the original signal into different components, making DL a potential alternative to existing decomposition methods. We apply a known sparse DL algorithm to a wind speed dataset and investigate decomposition quality and filtering characteristics. Reconstruction accuracy serves as a proxy for evaluating dictionary quality, and a coherence analysis is performed to analyze how different dictionary configurations lead to different filtering characteristics. The results of the presented work demonstrate how the learned features of different dictionaries represent transfer functions corresponding to frequency components found in the original data. Based on finite sets of atoms, dictionaries provide a deterministic mechanism to decompose a signal into various reconstructions and their respective remainders. These insights have direct application to the investigation and development of advanced signal decomposition and forecasting techniques.
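The decomposition-by-reconstruction idea described in the abstract can be illustrated with a minimal sketch. This is not the SAILnet algorithm used in the paper; it assumes scikit-learn's sparse dictionary learner as a stand-in, a synthetic two-frequency signal in place of the wind speed data, and illustrative window and dictionary sizes. The signal is sliced into windows, a small dictionary of temporal atoms is learned, and the sparse reconstruction plus its remainder form the two components of the decomposition:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
t = np.arange(2048)
# Synthetic composite signal standing in for the wind-speed series:
# a slow and a fast oscillation plus noise
signal = (np.sin(2 * np.pi * t / 64)
          + 0.5 * np.sin(2 * np.pi * t / 7)
          + 0.1 * rng.standard_normal(t.size))

# Slice the series into overlapping fixed-length windows (training patches)
win = 32
X = np.lib.stride_tricks.sliding_window_view(signal, win)[::4]

# Learn a finite set of temporal atoms; sparsity is enforced at
# transform time by allowing only a few active atoms per window
dl = MiniBatchDictionaryLearning(
    n_components=16, alpha=1.0, batch_size=64,
    transform_algorithm="omp", transform_n_nonzero_coefs=3,
    random_state=0)
codes = dl.fit_transform(X)

# Reconstruction = sparse codes combined with the learned atoms;
# the remainder is what this dictionary's "filter" does not pass
recon = codes @ dl.components_
remainder = X - recon

# Relative reconstruction error as a proxy for dictionary quality
err = np.linalg.norm(remainder) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.3f}")
```

Retraining with a different configuration (atom length, number of atoms, sparsity level) yields a different reconstruction/remainder split, which is the sense in which dictionaries act as deterministic, learned decomposition filters.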



Author affiliation

Jens Bürger, Universidad Privada Boliviana

Institute for Computational Intelligence (ICI)





How to cite

Bürger, J., & Calvimontes, J. (2019). Temporal Dictionary Learning for Time-Series Decomposition. Revista Investigación & Desarrollo, 19(1). https://doi.org/10.23881/idupbo.019.1-7i