An Empirical Study on Manifold Learning of Sliding Windows of Stock Price Time Series
Abstract
We apply the Isomap algorithm from manifold learning to discover manifold patterns in sliding windows of stock price time series. Using nearly six years of actual closing-price data for three stock indices, the Nasdaq, the S&P 500 and the Dow Jones, we construct sliding windows of five different lengths and perform manifold learning on each. The experiments show that, for every window length, the stock price time series lies in a manifold space. By constructing a minimum-residual interval we obtain the optimal embedding dimension, and we find that different stocks have different embedding dimensions; this optimal embedding dimension is taken as the intrinsic dimension of the stock price time series.
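To make the procedure described in the abstract concrete, below is a minimal Python sketch (not the authors' code) of the general workflow: build overlapping sliding windows from a closing-price series, fit Isomap for a range of candidate embedding dimensions, and pick the dimension where the residual variance flattens out. The window length (20), neighbourhood size (10), candidate dimensions (1-10) and the synthetic random-walk series are illustrative assumptions, and the residual-variance criterion used here is the standard Isomap one rather than the paper's specific "minimum residual interval" construction.

# Minimal sketch, assuming scikit-learn's Isomap; parameters are illustrative.
import numpy as np
from sklearn.manifold import Isomap
from sklearn.metrics import pairwise_distances

def sliding_windows(prices, window_len):
    """Stack overlapping windows of a 1-D price series into an (n, window_len) matrix."""
    prices = np.asarray(prices, dtype=float)
    n = len(prices) - window_len + 1
    return np.stack([prices[i:i + window_len] for i in range(n)])

def residual_variances(X, dims, n_neighbors=10):
    """For each candidate embedding dimension, fit Isomap and record the residual
    variance 1 - R^2 between geodesic distances on the neighbourhood graph and
    Euclidean distances in the low-dimensional embedding."""
    residuals = []
    for d in dims:
        iso = Isomap(n_neighbors=n_neighbors, n_components=d)
        Y = iso.fit_transform(X)
        idx = np.triu_indices_from(iso.dist_matrix_, k=1)
        geo = iso.dist_matrix_[idx]          # geodesic distances estimated by Isomap
        emb = pairwise_distances(Y)[idx]     # distances in the embedded space
        r2 = np.corrcoef(geo, emb)[0, 1] ** 2
        residuals.append(1.0 - r2)
    return residuals

if __name__ == "__main__":
    # Synthetic random-walk prices stand in for an index closing-price series.
    rng = np.random.default_rng(0)
    prices = np.cumsum(rng.normal(0.0, 1.0, 500)) + 100.0
    X = sliding_windows(prices, window_len=20)   # one of several window lengths to try
    dims = range(1, 11)
    for d, r in zip(dims, residual_variances(X, dims)):
        print(f"dim={d}: residual variance={r:.4f}")

In this kind of analysis, the "elbow" where the residual variance stops decreasing noticeably is read off as the intrinsic (optimal embedding) dimension, and the same procedure would be repeated for each index and each sliding-window length.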
