robustPoetEst() implements the robust version of the Principal Orthogonal complEment Thresholding (POET) estimator, a nonparametric, unobserved-factor-based estimator of the covariance matrix when the underlying distribution is elliptical (Fan et al. 2018). The estimator is defined as the sum of the sample covariance matrix's rank-k approximation and its post-thresholding principal orthogonal complement. The rank-k approximation is constructed from the sample covariance matrix, its leading eigenvalues, and its leading eigenvectors. The sample covariance matrix and leading eigenvalues are initially estimated via an M-estimation procedure and the marginal Kendall's tau estimator. The leading eigenvectors are estimated using the spatial Kendall's tau estimator. The hard thresholding function is used to regularize the estimated covariance matrix of the idiosyncratic errors, though other regularization schemes could be used.
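The construction described above can be sketched in a simplified, non-robust form. The helper below (`poet_sketch`, a hypothetical name) uses the ordinary sample covariance in place of the robust pilot estimates, but follows the same recipe: take the rank-k eigendecomposition, then hard-threshold the off-diagonal entries of the principal orthogonal complement.

```r
# Simplified, non-robust sketch of the POET construction. Assumptions:
# the sample covariance stands in for the robust pilot estimates, and
# off-diagonal entries of the residual are hard-thresholded at lambda.
poet_sketch <- function(dat, k, lambda) {
  sigma_hat <- cov(dat)                                  # pilot covariance estimate
  eig <- eigen(sigma_hat, symmetric = TRUE)
  v_k <- eig$vectors[, 1:k, drop = FALSE]
  # Rank-k approximation from the k leading eigenvalues/eigenvectors
  low_rank <- v_k %*% diag(eig$values[1:k], nrow = k) %*% t(v_k)
  # Principal orthogonal complement, hard-thresholded off the diagonal
  resid <- sigma_hat - low_rank
  thresholded <- ifelse(abs(resid) >= lambda, resid, 0)
  diag(thresholded) <- diag(resid)                       # diagonal left unthresholded
  low_rank + thresholded
}

poet_sketch(mtcars, k = 2, lambda = 0.1)
```

Note that this sketch omits the robust pilot steps (M-estimation, marginal and spatial Kendall's tau) that distinguish robustPoetEst() from the original POET estimator.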

We do not recommend employing this estimator when the estimand is the correlation matrix, since the diagonal entries of the resulting estimate are not guaranteed to equal one.

robustPoetEst(dat, k, lambda, var_est = c("sample", "mad", "huber"))

Arguments

dat

A numeric data.frame, matrix, or similar object.

k

An integer indicating the number of unobserved latent factors. Empirical evidence suggests that the POET estimator is robust to overestimation of this hyperparameter (Fan et al. 2013). In practice, it is therefore preferable to err on the side of larger values.

lambda

A non-negative numeric defining the amount of thresholding applied to each element of the sample covariance matrix's orthogonal complement.

var_est

A character dictating which variance estimator to use. This must be one of the strings "sample", "mad", or "huber". "sample" uses the sample variances; "mad" estimates the variances via the median absolute deviation; "huber" estimates the variances with an M-estimator under the Huber loss.
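The three var_est options roughly correspond to the marginal estimators sketched below. This is an illustrative sketch only: the helper names are hypothetical, and MASS::hubers() (Huber's "Proposal 2" joint location/scale estimator) stands in for the package's Huber M-estimator, whose tuning constants may differ.

```r
# Illustrative stand-ins for the three var_est options (assumption:
# MASS::hubers is a proxy for the package's Huber M-estimator).
var_sample <- function(x) var(x)
var_mad    <- function(x) mad(x)^2             # squared, consistency-scaled MAD
var_huber  <- function(x) MASS::hubers(x)$s^2  # Huber Proposal 2 scale, squared

x <- mtcars$mpg
c(sample = var_sample(x), mad = var_mad(x), huber = var_huber(x))
```

For heavy-tailed or contaminated data, the "mad" and "huber" options downweight extreme observations and so tend to give smaller, more stable variance estimates than "sample".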

Value

A matrix corresponding to the estimate of the covariance matrix.

References

Fan J, Liao Y, Mincheva M (2013). “Large covariance estimation by thresholding principal orthogonal complements.” Journal of the Royal Statistical Society. Series B (Statistical Methodology), 75(4), 603--680. ISSN 13697412, 14679868, https://www.jstor.org/stable/24772450.

Fan J, Liu H, Wang W (2018). “Large covariance estimation through elliptical factor models.” Ann. Statist., 46(4), 1383--1414. doi:10.1214/17-AOS1588.

Examples

robustPoetEst(dat = mtcars, k = 2L, lambda = 0.1, var_est = "sample")
#>              [,1]        [,2]        [,3]       [,4]         [,5]        [,6]
#>  [1,]   36.315178  -9.1164387  -693.67785 -376.03067   2.11448003  -5.3349120
#>  [2,]   -9.116439   2.7537461   190.39219  102.45344  -0.61276404   1.3986029
#>  [3,] -693.677853 190.3921862 15358.02683 7267.79320 -46.11029191 110.9814745
#>  [4,] -376.030665 102.4534357  7267.79320 4696.88357 -20.27007620  54.3234998
#>  [5,]    2.114480  -0.6127640   -46.11029  -20.27008   0.21013981  -0.4787939
#>  [6,]   -5.334912   1.3986029   110.98147   54.32350  -0.47879387   0.9573022
#>  [7,]    5.076855  -1.7307761  -100.04564  -81.84586   0.04839227  -0.3844377
#>  [8,]    1.851765  -0.6218331   -38.85915  -22.17540   0.15703000  -0.2555406
#>  [9,]    1.484410  -0.3848420   -33.54033  -11.16801   0.20367457  -0.3061374
#> [10,]    2.280022  -0.6167975   -51.04325  -17.11729   0.18503478  -0.4043015
#> [11,]   -6.199445   1.4396723   107.50032   80.26554  -0.11182020   0.7713457
#>                [,7]         [,8]         [,9]       [,10]        [,11]
#>  [1,]    5.07685520   1.85176484   1.48440950   2.2800219  -6.19944529
#>  [2,]   -1.73077606  -0.62183308  -0.38484203  -0.6167975   1.43967234
#>  [3,] -100.04564404 -38.85915432 -33.54032586 -51.0432471 107.50032232
#>  [4,]  -81.84586275 -22.17539543 -11.16800914 -17.1172950  80.26554184
#>  [5,]    0.04839227   0.15703000   0.20367457   0.1850348  -0.11182020
#>  [6,]   -0.38443765  -0.25554058  -0.30613738  -0.4043015   0.77134565
#>  [7,]    3.19310208   0.60378091  -0.16563939  -0.1499862  -1.85214514
#>  [8,]    0.60378091   0.11854837   0.03335364   0.1292760  -0.37857863
#>  [9,]   -0.16563939   0.03335364   0.21142802   0.2100142  -0.04591921
#> [10,]   -0.14998618   0.12927602   0.21001418   0.4573464   0.12806927
#> [11,]   -1.85214514  -0.37857863  -0.04591921   0.1280693   2.44068633