Robust and Profile Inference for Nonparametric and Semiparametric Regression Models
Abstract
Nonparametric and semiparametric regression models are well developed and popularly used for their flexibility and/or interpretability in identifying the regression structure between the response variable and the predictor variables. Among semiparametric models, the partially linear model is a commonly used class that is flexible enough while remaining well interpretable: it allows easier interpretation of the effect of each variable and may be preferred to a completely nonparametric regression because of the well-known "curse of dimensionality". Recently, in real medical data analysis, covariate-adjusted models and variable selection problems have received much attention. However, in the nonparametric regression setting the common kernel methods are sensitive to the bandwidth and cannot achieve a satisfactory convergence rate; estimation for covariate-adjusted partially linear models has not yet been studied; and, as noted in Fan and Li (2004), limited work has been done on variable selection for partially linear models. In this thesis we focus on these problems related to nonparametric and semiparametric regression models. More specifically, the motivation and the basic ideas of this thesis are as follows.
     It has been shown that the ordinary kernel estimator of a nonparametric regression function admits an approximate expansion in the bandwidth, which reveals a new regression relationship: r(x) can be regarded as the intercept obtained by regressing the kernel estimates r̂_{h_j}(x) on the bandwidths h_j. We can therefore rebuild a linear regression model and estimate r(x) by the weighted least squares method. The newly proposed estimator has a simple structure and achieves a smaller mean squared error without using a higher-order kernel; the optimal bandwidth is of order O(n^{-1/9}). Further, we find that even if the bandwidths h_j are not optimally selected but merely satisfy the mild condition h_j = O(n^{-α}) with 1/10 < α < 1/5, the new estimator r̂(x) still has a smaller mean squared error than the ordinary kernel estimator does. This shows that the new estimator is robust to bandwidth selection. Moreover, under some mild regularity conditions we obtain the asymptotic normality of the new estimator. Thus the two-stage (or three-stage) regression estimation proposed in Chapter 2, which combines nonparametric regression with parametric regression, improves upon nonparametric estimation in terms of both bandwidth selection and convergence rate. More generally, the method applies to general nonparametric regression models regardless of the dimension of the explanatory variable and of structural assumptions on the regression function; for example, it extends to multivariate nonparametric regression models and additive models.
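The two-stage idea above can be sketched numerically. The snippet below is a minimal illustration, not the thesis's exact procedure: it computes Nadaraya-Watson estimates at several bandwidths and reads off the intercept of an ordinary least-squares fit of those estimates against h_j^2, following the standard O(h^2) bias expansion (the thesis regresses on the bandwidths and uses weighted least squares; the kernel, bandwidth grid, and test function here are illustrative assumptions).

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson kernel estimate of r(x0) = E[Y | X = x0], Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def two_stage_estimate(x0, x, y, bandwidths):
    """Regress the kernel estimates r_hat(h_j) on h_j^2; the fitted intercept
    (the value extrapolated to bandwidth 0) is the bias-corrected estimate."""
    bandwidths = np.asarray(bandwidths, dtype=float)
    r_hats = np.array([nw_estimate(x0, x, y, h) for h in bandwidths])
    design = np.column_stack([np.ones_like(bandwidths), bandwidths ** 2])
    coef, *_ = np.linalg.lstsq(design, r_hats, rcond=None)
    return coef[0]  # intercept = bias-corrected r(x0)

rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n)

x0 = 0.25                          # true r(x0) = 1; curvature (hence kernel bias) is large here
hs = np.linspace(0.05, 0.15, 6)
corrected = two_stage_estimate(x0, x, y, hs)
raw = nw_estimate(x0, x, y, 0.15)  # single-bandwidth estimate, visibly biased downward
```

Because the bias of each kernel estimate is approximately proportional to h^2, extrapolating the fitted line back to bandwidth zero cancels the leading bias term without resorting to a higher-order kernel.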
     Motivated by the covariate-adjusted regression (CAR) of Senturk and Muller (2005) and by an applied problem, namely investigating the relationship between calcium absorption and calcium intake in studies of calcium deficiency, where the effects of body mass index and age must be accounted for, in Chapter 3 we introduce and investigate the covariate-adjusted partially linear model (CAPLM). In this model both the response Y and the predictor vector X can only be observed after being distorted by multiplicative factors ψ(U) and φr(U) respectively, and an additional variable, such as age or observation time T, is taken into account. Although our model appears to be a special case of the covariate-adjusted varying coefficient model (CAVCM) of Senturk (2006), the data types handled by CAPLM and CAVCM are fundamentally different. Having observations from multiple subjects at each fixed time point is the key assumption enabling the application of CAR in the first step of the estimation procedure of Senturk (2006), whereas the data we consider might contain only one observation at a fixed time; hence the inference methods for the two models differ. Following Cui et al. (2008), we construct nonparametric estimators of ψ(U) and φr(U), from which the true unobserved values of Y and X can be approximately recovered. Replacing the unobservable true data with the recovered data, β can then be estimated by the profile least squares method. Furthermore, under some mild conditions the asymptotic normality of the estimator of the parametric component is obtained; details are given in Section 3.3. Combined with a consistent estimate of the asymptotic covariance, this yields confidence intervals for the regression coefficients.
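The recovery step can be illustrated with a small simulation. This is a hedged sketch of the general covariate-adjusted idea, not the CAPLM procedure itself: it assumes the usual CAR identifiability condition E[ψ(U)] = 1 with U independent of Y, so that ψ(u) = E[Ỹ | U = u] / E[Ỹ], which can be estimated by kernel smoothing and used to undo the multiplicative distortion (the distortion function, bandwidth, and data-generating model below are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Latent data: y is never observed directly.
u = rng.uniform(0.0, 1.0, n)
y = 2.0 + rng.standard_normal(n)

# Multiplicative distortion with E[psi(U)] = 1 (identifiability condition).
psi = lambda t: 1.0 + 0.5 * (t - 0.5)   # integrates to 1 over U ~ Uniform(0, 1)
y_obs = psi(u) * y                      # only y_obs and u are observed

def kernel_smooth(u0, u, v, h=0.05):
    """Nadaraya-Watson estimate of E[v | U = u0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((u - u0) / h) ** 2)
    return np.sum(w * v) / np.sum(w)

# Since psi(u) = E[y_obs | U = u] / E[y_obs] under the identifiability condition,
# estimate psi nonparametrically and divide it out to recover y.
m_hat = np.array([kernel_smooth(ui, u, y_obs) for ui in u])
psi_hat = m_hat / np.mean(y_obs)
y_rec = y_obs / psi_hat
```

The recovered values y_rec approximate the latent y; in CAPLM the same recovery is applied to both the response and the predictors before the profile least squares step.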
     With the development of technology, people can easily obtain and store high-dimensional data sets, in which the number of variables p is comparable to or much larger than the sample size n. Variable selection plays an important role in high-dimensional data analysis, and the Dantzig selector is one method that performs variable selection and model fitting for linear and generalized linear models. In Chapter 4 we study variable selection for the partially linear model via the Dantzig selector, which minimizes the ℓ1-norm of the coefficients subject to a sup-norm constraint on the correlations between the centered design matrix and the residuals of the centered response observations. The large-sample properties of the Dantzig selector estimator are studied: when n tends to infinity with p fixed, under appropriate conditions the estimator converges in probability to β0, the solution of the corresponding limiting optimization problem. We also observe that the Dantzig selector is not necessarily consistent. To remedy this drawback, we adopt the adaptive Dantzig selector of Dicker and Lin (manuscript) for the partially linear model. Moreover, we show that under appropriate regularity conditions, as n tends to infinity with p fixed, the adaptive Dantzig selector estimator of the parametric component possesses the oracle property: it is consistent for model selection, and the estimator of the nonzero coefficients has the same asymptotic distribution as if the true model were known in advance. Since the adaptive Dantzig selector generalizes the Dantzig selector, both optimization problems can be solved by the efficient DASSO algorithm proposed by James et al. (2009). Choices of the tuning parameter and the bandwidth are also discussed.
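To make the Dantzig selector concrete, the sketch below uses the special case of an orthonormal design, where the solution of min ||β||₁ subject to ||Xᵀ(y − Xβ)||_∞ ≤ λ reduces to componentwise soft-thresholding of Xᵀy; the general case requires a linear-programming solver such as the DASSO algorithm. The data, dimensions, and λ are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def dantzig_selector_orthonormal(X, y, lam):
    """Dantzig selector for an orthonormal design (X^T X = I):
    soft-threshold the correlations z = X^T y at level lam."""
    z = X.T @ y
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

rng = np.random.default_rng(2)
n, p = 100, 20
# Orthonormal columns via QR of a random Gaussian matrix.
X, _ = np.linalg.qr(rng.standard_normal((n, p)))
beta_true = np.zeros(p)
beta_true[:3] = [5.0, -4.0, 3.0]   # sparse truth: three active coefficients
y = X @ beta_true + 0.5 * rng.standard_normal(n)

lam = 1.5
beta_hat = dantzig_selector_orthonormal(X, y, lam)

# Feasibility check: the sup-norm constraint holds by construction,
# since X^T(y - X beta_hat) = z - soft(z, lam) has entries bounded by lam.
max_corr = np.max(np.abs(X.T @ (y - X @ beta_hat)))
```

Soft-thresholding both shrinks the large correlations and zeroes out the small ones, which is why the Dantzig selector performs estimation and variable selection simultaneously; the adaptive version reweights λ componentwise to reduce the shrinkage bias on large coefficients.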
     In summary, this thesis further studies nonparametric and semiparametric regression models. First, we propose a robust, bias-corrected estimator in the nonparametric regression setting; the new two-stage (or three-stage) estimator attains a mean squared error of order O(n^{-8/9}) and is robust to bandwidth selection. Second, we investigate the covariate-adjusted partially linear model, develop an inference procedure, and, under mild conditions, obtain the asymptotic normality of the estimator of the parametric component together with confidence intervals. Finally, we explore variable selection and parameter estimation for high-dimensional partially linear models: when the sample size n tends to infinity and the number of predictor variables p is fixed, we study the large-sample properties of the Dantzig selector estimator and establish the oracle property of the adaptive Dantzig selector estimator under appropriate conditions.
     Simulation studies and real data analyses are also presented to illustrate the proposed methods.
References
[1]Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In Proc. 2nd International Symposium on Information Theory (V. Petrov and F. Csaki, eds.) 267-281. Akademiai Kiado, Budapest.
    [2]Bertail, P. and Politis, D. (2001). Extrapolation of subsampling distribution estimators: the i.i.d. and strong mixing cases. Scand. J. Statist.,29,667-680.
    [3]Bickel, P. J., Ritov, Y. and Tsybakov, A. (2008). Simultaneous analysis of Lasso and Dantzig selector. Annals of Statistics, to be published.
    [4]Breiman, L. (1996). Heuristics of instability and stabilization in model selection. Annals of Statistics,24,2350-2382.
    [5]Buja, A., Hastie, T. and Tibshirani, R. (1989). Linear smoothers and additive models (with discussion). Annals of Statistics,17,453-555.
    [6]Candes, E. and Tao, T. (2007a). The Dantzig selector: statistical estimation when p is much larger than n. Annals of Statistics,35,2313-2351.
    [7]Candes, E. and Tao, T. (2007b). Rejoinder: the Dantzig selector: statistical estimation when p is much larger than n. Annals of Statistics,35,2392-2404.
    [8]Carroll, R. J., Ruppert, D., Stefanski, L. A. and Crainiceanu, C. M. (2006). Measurement Error in Nonlinear Models (2nd ed.). New York: Chapman & Hall.
    [9]Chen, H. (1988). Convergence rates for parametric components in a partly linear model. Annals of Statistics,16,136-146.
    [10]Choi, E., Hall, P. and Rousson, V. (2000). Data sharpening methods for bias reduction in nonparametric regression. Annals of Statistics,28,1339-1355.
    [11]Cook, R. D. and Ni, L. (2005). Sufficient dimension reduction via inverse regression: a minimum discrepancy approach. J. Amer. Statist. Assoc.,100,410-428.
    [12]Cook, R. D. (1998). Regression Graphics: Ideas for Studying Regressions through Graphics. Wiley & Sons, New York.
    [13]Craven, P. and Wahba, G. (1979). Smoothing noisy data with spline functions: estimating the correct degree of smoothing by the method of generalized cross-validation. Numerische Mathematik,31,377-403.
    [14]Cui, X., Zhu, L. X. and Lin, L. (2007). A direct estimation for covariate-adjusted regression. (manuscript)
    [15]Cui, X., Guo, W. S., Lin, L. and Zhu, L. X. (2008). Covariate-adjusted nonlinear regression. Annals of Statistics,37,1839-1870.
    [17]Davis, C. S. (2002). Statistical Methods for the Analysis of Repeated Measurements. New York:Springer, p.336.
    [18]Devroye, L. P. and Gyorfi, L. (1985). Nonparametric Density Estimation:The L1 View. Wiley, New York.
    [19]Dicker, L. and Lin, X. (2009). A large sample analysis of the Dantzig selector and extensions. (manuscript).
    [20]Donoho, D. L. and Johnstone, I. M. (1994). Ideal spatial adaptation by wavelet shrinkage. Biometrika,81,425-455.
    [21]Donoho, D. L. and Huo, X. (2001). Uncertainty principles and ideal atomic decomposition. IEEE Trans. Inform. Theory,47,2845-2862.
    [22]Efromovich, S. (1999). Nonparametric Curve Estimation: Methods, Theory and Applications. Springer-Verlag, New York.
    [23]Efron, B., Hastie, T. and Tibshirani, R. (2004). Least angle regression (with discussion). Annals of Statistics,32,407-451.
    [24]Engle, R. F., Granger, C. W. J., Rice, J.& Weiss, A. (1986). Semiparametric estimates of the relation between weather and electricity sales. Journal of the American Statistical Association,81,310-320.
    [25]Eubank, R. L. (1988). Spline Smoothing and Nonparametric Regression. Marcel Dekker, New York.
    [26]Fan, J. (1993). Local linear regression smoothers and their minimax efficiency. Annals of Statistics,21,196-216.
    [27]Fan, J. and Gijbels, I. (1996). Local Polynomial Modelling and Its Applications. Chapman and Hall, London.
    [28]Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association,96,1348-1360.
    [29]Fan, J. and Li, R. (2004). New estimation and model selection procedures for semiparametric modeling in longitudinal data analysis. Journal of the American Statistical Association,99,710-723.
    [30]Fan, J. and Peng, H. (2004). Nonconcave penalized likelihood with a diverging number of parameters. Annals of Statistics,32,928-961.
    [31]Fan, J. and Zhang, J. (2000). Two-step estimation of functional linear models with applications to longitudinal data. J. R. Statist. Soc. B,62,303-322.
    [32]Fan, J. and Ren, Y. (2006). Statistical analysis of DNA microarray data. Clinical Cancer Research,12,4469-4473.
    [33]Fan, J. and Lv, J. (2008). Sure independence screening for ultrahigh dimensional feature space. Journal of the Royal Statistical Society, Series B:Statistical Methodology,70, 849-911.
    [34]Fan, J. and Song, R. (2009). Sure independence screening in generalized linear models with NP-dimensionality. Revised for Annals of Statistics.
    [35]Fang, J. Q., Hardle, W. and Mammen, E. (1998). Direct estimation of low-dimensional components in additive models. Annals of Statistics,26,943-971.
    [36]Friedman, J. H. and Stuetzle, W. (1981). Projection pursuit regression. Journal of the American Statistical Association,76,817-823.
    [37]Fu, W. J. (1998). Penalized regressions:the Bridge versus the lasso. J. Computational and Graphical Statistics,7,397-416.
    [38]Gao, J. T. (1992). Large Sample Theory in Semiparametric Regression Models. Ph.D. Thesis, Graduate School, University of Science & Technology of China, Hefei, P.R. China.
    [39]Gao, J. T., Chen, X. R. and Zhao, L. C. (1994). Asymptotic normality of a class of estimates in partly linear models. Acta Mathematica Sinica,37,156-268.
    [40]Gasser, Th. and Muller, H. G. (1979). Kernel estimation of regression functions. In Smoothing Techniques for Curve Estimation (Th. Gasser and M. Rosenblatt, eds.), Springer Lecture Notes in Mathematics No.757, Springer-Verlag, Berlin,23-68.
    [41]Gordon, L. and Olshen, R. A. (1980). Consistent nonparametric regression from recursive partitioning schemes. Journal of Multivariate Analysis,10,611-627.
    [42]Gray, H. and Schucany, W. R. (1972). The generalized jackknife statistic. New York, M. Dekker.
    [43]Green, P. J. and Silverman, B. W. (1994). Nonparametric Regression and Generalized Linear Models: A Roughness Penalty Approach. Chapman and Hall, London.
    [44]Hall, P. and Patil, P. (1995). Formulae for mean integrated squared error of nonlinear wavelet-based density estimators. Annals of Statistics,23,905-928.
    [45]Hardle, W. (1990). Applied Nonparametric Regression. Cambridge University Press, Boston.
    [46]Hart, J. D. (1997). Nonparametric Smoothing and Lack-of-Fit Tests. Springer-Verlag, New York.
    [47]Hastie, T. J. and Tibshirani, R. (1990). Generalized Additive Models. Chapman and Hall, London.
    [48]Hastie, T. J. and Tibshirani, R. (1993). Varying-coefficient models. J. Royal. Statist. Soc. B,55,757-796.
    [49]Heaney, R. P., Recker, R. R., Stegman, M. R. and Moy, A. J. (1989). Calcium absorption in women:relationships to calcium intake, estrogen status, age. Journal of Bone and Mineral Research,4,469-475.
    [50]Heaney, R. P. (2003). Normalizing calcium intake:projected population effects for body weight. Journal of Nutrition,133,268S-270S.
    [51]Heckman, N.E. (1986). Spline smoothing in partly linear models. Journal of the Royal Statistical Society, Series B,48,244-248.
    [52]Hjort, N. L. and Glad, I. K. (1995). Nonparametric density estimation with a parametric start. Annals of Statistics,23,882-904.
    [53]Hjort, N. L. and Jones, M. C. (1996). Locally parametric nonparametric density estimation. Annals of Statistics,24,1619-1647.
    [54]Hong, S. Y. (1991). Estimation theory of a class of semiparametric regression models. Sciences in China Ser. A,12,1258-1272.
    [55]Hong, S. Y. and Zhao, Z. B. (1993). Asymptotics for kernel-LS estimate of partially linear model. Chinese Ann Math A,14,717-731.
    [56]Horowitz, J. L. and Mammen, E. (2004). Nonparametric estimation of an additive model with a link function. Annals of Statistics,32,2412-2443.
    [57]James, G. M. and Radchenko, P. (2009). A generalized Dantzig selector with shrinkage tuning. Biometrika,96,323-337.
    [58]James, G. M., Radchenko, P. and Lv, J. C. (2009). Dasso:connections between the Dantzig selector and lasso. Journal of the Royal Statistical Society, Series B,71,127-142.
    [59]Jiang, J., Lahiri, P. and Wan, S. (2002). A unified jackknife theory for empirical best prediction with M-estimation. Annals of Statistics,30,1782-1810.
    [60]Knight, K. and Fu, W. J. (2000). Asymptotics for lasso-type estimators. Annals of Statistics,28,1356-1378.
    [61]Li, Y. and Dicker, L. (2009). Dantzig selector for censored linear regression models. (manuscript).
    [62]Liang, H. (1992). Asymptotic Efficiency in Semiparametric Models and Related Topics. Thesis, Institute of Systems Science, Chinese Academy of Sciences, Beijing, P.R. China.
    [63]Liang, H., Hardle, W., Carroll R. J. (1999). Estimation In a Semiparametric Partially Linear Errors-In-Variables Model. Annals of Statistics,27,1519-1535.
    [64]Liang, H. and Li, R. Z. (2009) Variable selection for partially linear models with mea-surement errors. Journal of the American Statistical Association,104,234-248.
    [65]Mallows, C. L. (1973). Some comments on Cp. Technometrics,15,661-675.
    [66]Marron, J. S. and Wand, M. P. (1992). Exact mean integrated squared error. Annals of Statistics,20,712-736.
    [67]Meinshausen, N., Rocha, G. and Yu, B. (2007). A tale of three cousins: Lasso, L2Boosting, and Dantzig (discussion of the Dantzig selector paper by Candes and Tao). Annals of Statistics,35,2372-2384.
    [68]Miller, R. (1974). The jackknife-a review. Biometrika,61,1-15.
    [69]Nadaraya, E. A. (1964). On estimating regression. Theory of Probability and its Applications,9,141-142.
    [70]Naito, K. (2004). Semiparametric density estimation by local L2 fitting. Annals of Statistics,32,1162-1191.
    [71]Ogden, T. (1997). Essential Wavelets for Statistical Applications and Data Analysis. Birkhäuser, Boston.
    [72]Priestley, M. B. and Chao, M. T. (1972). Non-parametric function fitting. Journal of the Royal Statistical Society, Series B,34,385-392.
    [73]Rice, J.(1986). Convergence rates for partially splined models. Statistics & Probability Letters,4,203-208.
    [74]Robinson, P.M.(1988). Root-n-consistent semiparametric regression. Econometrica,56, 931-954.
    [75]Ruppert, D., Sheather, S.J., and Wand, M.P. (1995), An effective bandwidth selector for local least squares regression. Journal of the American Statistical Association,90, 1257-1270.
    [76]Schwarz, G. (1978). Estimating the dimension of a model. Annals of Statistics,6,461-464.
    [77]Senturk, D. and Muller, H. G. (2005). Covariate-adjusted regression. Biometrika,92, 75-89.
    [78]Senturk, D. and Muller, H. G. (2006). Inference for covariate adjusted regression via varying coefficient models. Annals of Statistics,34,654-679.
    [79]Senturk, D. (2006). Covariate-adjusted varying coefficient models. Biostatistics,7,235-251.
    [80]Silverman, B. W. (1986). Density Estimation for Statistics and Data Analysis. Chapman and Hall, London.
    [81]Simonoff, J. S. (1996). Smoothing Methods in Statistics. Springer, New York.
    [82]Speckman, P. (1988). Kernel smoothing in partial linear models. Journal of the Royal Statistical Society, Series B,50,413-436.
    [83]Stone, M. (1974). Cross-validatory choice and assessment of statistical predictions (with discussion). J. Royal Statist. Soc. B,36,111-147.
    [84]Stone, C. J. (1977). Consistent nonparametric regression. Annals of Statistics,5,595-645.
    [85]Ruppert, D. and Wand, M. P. (1994). Multivariate locally weighted least squares regression. Annals of Statistics,22,1346-1370.
    [86]Tibshirani, R. (1996) Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B,58,267-288.
    [87]Tibshirani, R., Hastie, T., Narasimhan, B.& Chu, G. (2003), Class prediction by nearest shrunken centroids, with applications to DNA microarrays. Statist. Sci.,18,104-117.
    [88]Tong, T. and Wang, Y. (2005). Estimating residual variance in nonparametric regression using least squares. Biometrika,92,821-830.
    [89]Wahba, G. (1977). A survey of some smoothing problems and the method of generalized cross-validation for solving them. In Krisnaiah, P. R. (ed.), Applications of Statistics. North Holland:Amsterdam, pp.507-523.
    [90]Wahba, G. (1990). Spline Models for observational Data. SIAM, Philadelphia.
    [91]Wand, M. P. and Jones, M. C. (1995). Kernel Smoothing. Chapman and Hall, London.
    [92]Watson, G. S. (1964). Smooth regression analysis. Sankhya Ser. A 26,359-372.
    [93]Wu, C. F. J. (1986). Jackknife, bootstrap and other resampling methods in regression analysis. Annals of Statistics,14,1261-1295.
    [94]Xie, H. L. and Huang, J. (2009). SCAD-penalized regression in high-dimensional partially linear models. Annals of Statistics,37,673-696.
    [95]Zou, H.& Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society, Series B,67,301-320.
    [96]Zou, H. and Li, R. (2008). One-step sparse estimates in nonconcave penalized likelihood models. Annals of Statistics,36,1509-1533.