The Optimal Estimation of Lasso
Science Journal of Applied Mathematics and Statistics
Volume 3, Issue 6, December 2015, Pages: 293-297
Received: Dec. 30, 2015; Published: Dec. 30, 2015
Author
Huiyi Xia, Department of Mathematics and Computer Science, Chizhou University, Anhui, China
Abstract
Estimation of the lasso is an important problem in high-dimensional data analysis, and a closed-form expression for the optimal lasso estimate has remained an open question. To address this problem, we derive the structure of the lasso estimate by mathematical methods under an orthogonal design. A formula for the optimal lasso estimate is obtained in the orthogonal-design case, and it is shown that the lasso carries out dimension reduction through a gradual process.
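As background for the orthogonal-design setting, the sketch below illustrates the well-known closed form in the orthonormal case: minimizing (1/2)||y - Xb||^2 + lambda*||b||_1 with X^T X = I reduces to soft-thresholding the least-squares coefficients. This is a generic illustration under that assumption, not a reproduction of the paper's own derivation; the function name and the simulated data are illustrative only.

```python
import numpy as np

def lasso_orthonormal(X, y, lam):
    """Closed-form lasso estimate for an orthonormal design (X^T X = I).

    For the objective (1/2)*||y - X b||^2 + lam*||b||_1, the solution is
    soft-thresholding of the least-squares coefficients:
        b_j = sign(b_ols_j) * max(|b_ols_j| - lam, 0).
    """
    b_ols = X.T @ y  # least-squares estimate, since X^T X = I
    return np.sign(b_ols) * np.maximum(np.abs(b_ols) - lam, 0.0)

# Usage example with a random orthonormal design (illustrative data).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 5)))  # 50 x 5 with Q^T Q = I
beta_true = np.array([3.0, 0.0, -2.0, 0.0, 1.5])
y = Q @ beta_true + 0.1 * rng.standard_normal(50)
print(lasso_orthonormal(Q, y, lam=0.5))  # small coefficients shrink to exactly 0
```

The example also shows the "gradual" dimension reduction mentioned in the abstract: as lam grows, more coefficients cross the threshold and are set exactly to zero.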
Keywords
Lasso, Estimation, Solution
To cite this article
Huiyi Xia, The Optimal Estimation of Lasso, Science Journal of Applied Mathematics and Statistics. Vol. 3, No. 6, 2015, pp. 293-297. doi: 10.11648/j.sjams.20150306.19