Incremental Unsupervised Time Series Analysis Using Merge Growing Neural Gas (bibtex)
by A. Andreakis, N. von Hoyningen-Huene and M. Beetz
Abstract:
We propose Merge Growing Neural Gas (MGNG) as a novel unsupervised growing neural network for time series analysis. MGNG combines the state-of-the-art recursive temporal context of Merge Neural Gas (MNG) with the incremental Growing Neural Gas (GNG) and thereby enables the analysis of unbounded and possibly infinite time series in an online manner. There is no need to define the number of neurons a priori and only constant parameters are used. In order to focus on frequent sequence patterns, an entropy maximization strategy is utilized that controls the creation of new neurons. Experimental results demonstrate reduced time complexity compared to MNG while retaining similar accuracy in time series representation.
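The recursive temporal context the abstract refers to originates in Merge Neural Gas: each input is matched against both a weight vector and a context vector, and the global context for the next time step is a "merge" of the winner's weight and context. The sketch below illustrates that mechanism only; it is not the paper's full MGNG algorithm (which adds GNG-style edge handling and entropy-driven neuron insertion), and the parameter names `alpha`, `beta`, and the learning rates are illustrative assumptions.

```python
import numpy as np

def find_winner(x, C, weights, contexts, alpha=0.5):
    """MNG-style distance: blend of input match and temporal-context match."""
    d = (1 - alpha) * np.sum((weights - x) ** 2, axis=1) \
        + alpha * np.sum((contexts - C) ** 2, axis=1)
    return int(np.argmin(d))

def step(x, C, weights, contexts, alpha=0.5, beta=0.75, lr_w=0.05, lr_c=0.05):
    """One online update: pick the winner, adapt it, form the next context.

    The recursion C_{t+1} = (1 - beta) * w_winner + beta * c_winner is the
    "merge" that encodes the history of the sequence in a fixed-size vector,
    which is what allows unbounded streams to be processed online.
    """
    r = find_winner(x, C, weights, contexts, alpha)
    weights[r] += lr_w * (x - weights[r])      # move winner toward the input
    contexts[r] += lr_c * (C - contexts[r])    # move winner toward the context
    C_next = (1 - beta) * weights[r] + beta * contexts[r]
    return r, C_next
```

Because the context is a constant-size recursive summary rather than a stored window, each step costs O(number of neurons), independent of the length of the series seen so far.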
Reference:
Incremental Unsupervised Time Series Analysis Using Merge Growing Neural Gas (A. Andreakis, N. von Hoyningen-Huene and M. Beetz), In WSOM (J.C. Príncipe, R. Miikkulainen, eds.), Springer, volume 5629, 2009.
Bibtex Entry:
@inproceedings{andreakis_incremental_2009,
 author = {Andreakis, A. and von Hoyningen-Huene, N. and Beetz, M.},
 title = {Incremental Unsupervised Time Series Analysis Using Merge Growing
	Neural Gas},
 booktitle = {{WSOM}},
 year = {2009},
 editor = {Príncipe, José Carlos and Miikkulainen, Risto},
 volume = {5629},
 series = {Lecture Notes in Computer Science},
 pages = {10--18},
 publisher = {Springer},
 abstract = {We propose Merge Growing Neural Gas ({MGNG}) as a novel unsupervised
	growing neural network for time series analysis. {MGNG} combines
	the state-of-the-art recursive temporal context of Merge Neural Gas
	({MNG}) with the incremental Growing Neural Gas ({GNG}) and thereby
	enables the analysis of unbounded and possibly infinite time series
	in an online manner. There is no need to define the number of neurons
	a priori and only constant parameters are used. In order to focus
	on frequent sequence patterns, an entropy maximization strategy is
	utilized that controls the creation of new neurons. Experimental
	results demonstrate reduced time complexity compared to {MNG} while
	retaining similar accuracy in time series representation.},
 isbn = {978-3-642-02396-5},
}