# Tail inverse regression for dimension reduction with extreme response

@inproceedings{Aghbalou2021TailIR, title={Tail inverse regression for dimension reduction with extreme response}, author={Anass Aghbalou and François Portier and Anne Sabourin and Chen Zhou}, year={2021} }

We consider the problem of dimensionality reduction for the prediction of a target Y ∈ ℝ to be explained by a covariate vector X ∈ ℝ^p, with a particular focus on extreme values of Y, which are of special concern in risk management. The general purpose is to reduce the dimensionality of the statistical problem through an orthogonal projection onto a lower-dimensional subspace of the covariate space. Inspired by sliced inverse regression (SIR) methods, we develop a novel framework (TIREX, Tail…
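
The abstract is truncated above, so the exact TIREX estimator is not reproduced here. As a rough illustration of the underlying idea, a SIR-type inverse-regression step applied only to tail observations, here is a minimal sketch in Python; the function name `tail_sir`, the threshold rule, and the use of the first conditional moment alone are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def tail_sir(X, y, tail_frac=0.05, n_dirs=1):
    """SIR-style direction estimate using only tail exceedances of y.

    Illustrative sketch: standardize X, keep observations whose response
    lies above a high empirical quantile, and extract leading eigenvectors
    of the conditional first-moment outer product. Not necessarily the
    authors' exact TIREX estimator.
    """
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    # Whitening via the inverse symmetric square root of the covariance.
    w, V = np.linalg.eigh(cov)
    cov_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    Z = (X - mu) @ cov_inv_sqrt

    # Condition on the extreme-response event {y > high quantile}.
    thresh = np.quantile(y, 1.0 - tail_frac)
    Z_tail = Z[y > thresh]

    # First inverse moment E[Z | y > thresh]; its outer product spans
    # (part of) the tail central subspace under linearity conditions.
    m1 = Z_tail.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.outer(m1, m1))
    dirs = vecs[:, ::-1][:, :n_dirs]        # leading eigenvectors
    return cov_inv_sqrt @ dirs              # back to the original scale

# Toy usage: a single-index heavy-tailed model y = exp(X @ beta) * noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 6))
beta = np.array([1.0, -1.0, 0.0, 0.0, 0.0, 0.0])
y = np.exp(X @ beta) * rng.pareto(3.0, 5000)
b_hat = tail_sir(X, y)
print(b_hat.ravel() / np.linalg.norm(b_hat))   # should align with +/- beta
```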

#### References

Sufficient Dimension Reduction via Inverse Regression

- Mathematics
- 2005

A family of dimension-reduction methods, the inverse regression (IR) family, is developed by minimizing a quadratic objective function. An optimal member of this family, the inverse regression…

Dimension reduction in multivariate extreme value analysis

- Mathematics
- 2015

Non-parametric assessment of extreme dependence structures between an arbitrary number of variables, though quite well-established in dimension 2 and recently extended to moderate dimensions such as…

Tail dimension reduction for extreme quantile estimation

- Mathematics
- 2018

In a regression context where a response variable Y ∈ ℝ is recorded with a covariate X ∈ ℝ^p, two situations can occur simultaneously: (a) we are interested in the tail of the conditional distribution…

Kernel dimension reduction in regression

- Mathematics
- 2009

We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives directly from the formulation of SDR in terms of the conditional independence of the covariate X from…

Dimensionality Reduction for Supervised Learning with Reproducing Kernel Hilbert Spaces

- Computer Science
- J. Mach. Learn. Res.
- 2004

This work treats the problem of dimensionality reduction as that of finding a low-dimensional “effective subspace” of X which retains the statistical relationship between X and Y, and establishes a general nonparametric characterization of conditional independence using covariance operators on a reproducing kernel Hilbert space.
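
To make the covariance-operator idea concrete, here is a minimal sketch of a kernel dependence objective in the spirit of this line of work: the trace criterion Tr[K_Y (K_{BᵀX} + nεI)⁻¹] is smaller when the projection BᵀX retains more of the relationship between X and Y. The kernel choice, bandwidth σ, and regularizer ε are illustrative assumptions, and the optimization over B is omitted.

```python
import numpy as np

def centered_gram(Z, sigma):
    """Doubly centered Gaussian-kernel Gram matrix of the rows of Z."""
    sq = np.sum(Z ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T   # pairwise squared distances
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    n = len(Z)
    H = np.eye(n) - np.ones((n, n)) / n               # centering matrix
    return H @ K @ H

def kdr_objective(B, X, y, sigma=1.0, eps=1e-3):
    """Trace criterion Tr[Ky (Kz + n*eps*I)^{-1}] for the projection B.

    Smaller values indicate that B^T X retains more of the statistical
    relationship between X and Y (illustrative sketch, not the paper's
    exact estimator or tuning).
    """
    n = len(X)
    Kz = centered_gram(X @ B, sigma)
    Ky = centered_gram(np.asarray(y).reshape(-1, 1), sigma)
    return np.trace(Ky @ np.linalg.inv(Kz + n * eps * np.eye(n)))

# Toy check: y depends on X only through its first coordinate, so the
# projection onto e1 should score lower than the one onto e5.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)
print(kdr_objective(np.eye(5)[:, :1], X, y))
print(kdr_objective(np.eye(5)[:, 4:], X, y))
```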

Sliced Inverse Regression for Dimension Reduction

- Mathematics
- 1991

Modern advances in computing power have greatly widened scientists' scope in gathering and investigating information from many variables, information which might have been ignored in the…

Principal component analysis for multivariate extremes

- Mathematics
- 2019

The first order behavior of multivariate heavy-tailed random vectors above large radial thresholds is ruled by a limit measure in a regular variation framework. For a high dimensional vector, a…
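
A minimal sketch of what PCA above a radial threshold can look like, assuming the standard regular-variation setup: keep the k observations with the largest norm, project them onto the unit sphere, and take principal components of the resulting angles. The choice of norm, of k, the centering step, and the function name are illustrative, not the paper's exact procedure.

```python
import numpy as np

def extreme_pca(X, k=200, n_comp=2):
    """PCA of the angular parts of the k most extreme observations.

    Sketch: rank points by Euclidean norm, keep the k largest (the
    points informing the limit measure), project them onto the unit
    sphere, and return leading principal directions of the angles.
    """
    r = np.linalg.norm(X, axis=1)
    idx = np.argsort(r)[-k:]                  # k largest radii
    angles = X[idx] / r[idx, None]            # angular components on the sphere
    angles = angles - angles.mean(axis=0)     # centered variant of the analysis
    _, _, Vt = np.linalg.svd(angles, full_matrices=False)
    return Vt[:n_comp].T                      # p x n_comp matrix of directions
```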

Sliced inverse regression with regularizations

- Mathematics, Medicine
- Biometrics
- 2008

An L2 regularization is introduced, and an alternating least-squares algorithm is developed, to enable SIR to work with n < p and highly correlated predictors while achieving simultaneous reduction estimation and predictor selection.
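
The cited paper develops an alternating least-squares algorithm; as a simpler stand-in that shows where the L2 penalty enters, the sketch below solves a ridge-regularized generalized eigenproblem M v = μ (Σ̂ + λI) v, with M the between-slice covariance of slice means. The slice construction and λ are illustrative choices.

```python
import numpy as np
from scipy.linalg import eigh

def ridge_sir(X, y, n_slices=10, lam=0.1, n_dirs=1):
    """SIR with an L2 (ridge) penalty on the covariance.

    Closed-form stand-in for the regularized approach: build the
    between-slice covariance M of slice means of X and solve the
    generalized eigenproblem M v = mu (Sigma + lam I) v, so that
    n < p or strong predictor correlation no longer break the inversion.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    edges = np.quantile(y, np.linspace(0.0, 1.0, n_slices + 1))
    labels = np.searchsorted(edges[1:-1], y)          # slice index per point
    M = np.zeros((p, p))
    for s in range(n_slices):
        mask = labels == s
        if not mask.any():
            continue
        m = Xc[mask].mean(axis=0)                     # slice mean of centered X
        M += (mask.sum() / n) * np.outer(m, m)
    vals, vecs = eigh(M, Sigma + lam * np.eye(p))     # generalized eigenproblem
    return vecs[:, ::-1][:, :n_dirs]                  # top generalized directions
```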

Investigating Smooth Multiple Regression by the Method of Average Derivatives

- Mathematics
- 1989

Let (x1, …, xk, y) be a random vector where y denotes a response on the vector x of predictor variables. In this article we propose a technique [termed average derivative estimation (ADE)]…
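
The ADE idea admits a compact estimator via integration by parts: δ = E[∇m(X)] = −E[Y ∇f(X)/f(X)] when the density f vanishes at the boundary, so a kernel density estimate of f and its gradient suffice. A minimal sketch, assuming a Gaussian kernel and omitting the trimming and leave-one-out refinements of the published method:

```python
import numpy as np

def ade(X, y, h=0.5):
    """Average derivative estimate of delta = E[grad m(X)].

    Uses delta = -E[Y * grad f(X) / f(X)] with a Gaussian kernel density
    estimate of f; the normalizing constants cancel in the ratio.
    Bandwidth choice is an illustrative assumption.
    """
    n, p = X.shape
    diff = (X[:, None, :] - X[None, :, :]) / h        # (n, n, p) scaled differences
    K = np.exp(-0.5 * np.sum(diff ** 2, axis=2))      # Gaussian kernel weights
    f_hat = K.sum(axis=1)                             # density at each X_i, up to a constant
    grad_f = -(K[:, :, None] * diff).sum(axis=1) / h  # its gradient, same constant
    score = grad_f / f_hat[:, None]                   # grad f / f; constants cancel
    return -np.mean(np.asarray(y)[:, None] * score, axis=0)
```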

Dimension Reduction in Regressions Through Cumulative Slicing Estimation

- Mathematics
- 2010

In this paper we offer a complete methodology of cumulative slicing estimation for sufficient dimension reduction. In parallel to the classical slicing estimation, we develop three methods that are…
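
In cumulative slicing, the discrete slices of SIR are replaced by cumulative inverse moments m(t) = E[Z 1{Y ≤ t}] evaluated at every observed response, and the matrix M = E[m(Y) m(Y)ᵀ] is eigen-decomposed. A minimal sketch of the first-moment variant; the weighting and standardization details of the published methods are simplified.

```python
import numpy as np

def cume(X, y, n_dirs=1):
    """Cumulative slicing (CUME-style) sketch for the first inverse moment.

    Standardize X, accumulate m(y_(i)) = average of Z over {y_j <= y_(i)}
    at every observed response, eigen-decompose M = E[m(Y) m(Y)^T], and
    map the leading directions back to the original covariate scale.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    w, V = np.linalg.eigh(np.cov(X, rowvar=False))
    S_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T          # whitening matrix
    Z = Xc @ S_inv_sqrt

    order = np.argsort(y)
    m_hat = np.cumsum(Z[order], axis=0) / n            # m(y_(1)), ..., m(y_(n))
    M = m_hat.T @ m_hat / n                            # empirical E[m(Y) m(Y)^T]
    vals, vecs = np.linalg.eigh(M)
    return S_inv_sqrt @ vecs[:, ::-1][:, :n_dirs]      # top directions, original scale
```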