Tibshirani, Robert
Overview
Works: 101 works in 359 publications in 3 languages and 4,670 library holdings

Roles: Author, Thesis advisor, Contributor, Other
Classifications: Q325.75, 519.544
Publication Timeline
Most widely held works about Robert Tibshirani
Survey model-assisted estimation with the lasso by Cooper S. Schumacher
Most widely held works by Robert Tibshirani
The elements of statistical learning : data mining, inference, and prediction : with 200 full-color illustrations by Trevor Hastie (Book)
80 editions published between 2001 and 2016 in English and held by 1,558 WorldCat member libraries worldwide
Describes important statistical ideas in machine learning, data mining, and bioinformatics. Covers a broad range, from supervised learning (prediction) to unsupervised learning, including classification trees, neural networks, and support vector machines.
An introduction to the bootstrap by Bradley Efron (Book)
33 editions published between 1993 and 1998 in English and held by 1,089 WorldCat member libraries worldwide
The accuracy of a simple mean -- Random samples and probabilities -- The empirical distribution function and the plug-in principle -- Standard errors and estimated standard errors -- The bootstrap estimate of standard error -- Bootstrap standard errors: some examples -- More complicated data structures -- Regression models -- Estimates of bias -- The jackknife -- Confidence intervals based on bootstrap "tables" -- Confidence intervals based on bootstrap percentiles -- Better bootstrap confidence intervals -- Permutation tests -- Hypothesis testing with the bootstrap -- Cross-validation and other estimates of prediction error -- Adaptive estimation and calibration -- Assessing the error in bootstrap estimates -- A geometrical representation for the bootstrap and jackknife -- An overview of nonparametric and parametric inference -- Further topics in bootstrap confidence intervals -- Efficient bootstrap computations -- Approximate likelihoods -- Bootstrap bioequivalence -- Discussion and further topics
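The procedure the book opens with, the bootstrap estimate of standard error, can be sketched in a few lines. This is a minimal illustration, not the book's own code, and the data values are invented for the example:

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, B=200, seed=0):
    """Bootstrap estimate of the standard error of `stat`:
    draw B resamples of the data with replacement, compute the
    statistic on each, and take the standard deviation of the
    B bootstrap replicates."""
    rng = random.Random(seed)
    n = len(data)
    replicates = [stat([rng.choice(data) for _ in range(n)]) for _ in range(B)]
    return statistics.stdev(replicates)

# For the sample mean, the bootstrap answer should be close to the
# textbook formula s / sqrt(n).
data = [14, 10, 12, 19, 11, 13, 17, 15, 16, 12]
se_boot = bootstrap_se(data, B=500)
se_formula = statistics.stdev(data) / len(data) ** 0.5
```

The appeal, as the book stresses, is that nothing in `bootstrap_se` is specific to the mean: `stat` can be a median, a correlation, or a far more complicated procedure.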
Generalized additive models by Trevor Hastie (Book)
34 editions published between 1984 and 1999 in English and Undetermined and held by 575 WorldCat member libraries worldwide
Likelihood-based regression models, such as the normal linear regression model and the linear logistic model, assume a linear (or some other parametric) form for the covariate effects. The authors introduce the Local Scoring procedure, which is applicable to any likelihood-based regression model; the class of Generalized Linear Models contains many of these. In this class the Local Scoring procedure replaces a linear predictor with an additive predictor, hence the name Generalized Additive Models. Local Scoring can also be applied to nonstandard models like Cox's proportional hazards model for survival data.
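For Gaussian errors, Local Scoring reduces to the backfitting algorithm: cycle over covariates, smoothing each partial residual against its own covariate. The following toy sketch uses a crude running-mean smoother; the smoother, window size, and data are invented for illustration:

```python
def running_mean_smoother(x, z, window=3):
    """Crude scatterplot smoother: for each x_i, average z over the
    `window` nearest points in x-order."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    fitted = [0.0] * len(x)
    for rank, i in enumerate(order):
        lo = max(0, rank - window // 2)
        hi = min(len(x), lo + window)
        neighbors = order[lo:hi]
        fitted[i] = sum(z[k] for k in neighbors) / len(neighbors)
    return fitted

def backfit(X_cols, y, n_iter=20, window=3):
    """Backfitting for an additive model y ~ a + f1(x1) + ... + fp(xp):
    repeatedly smooth each partial residual against its own covariate,
    centering each fitted function so the intercept stays identified."""
    n, p = len(y), len(X_cols)
    a = sum(y) / n
    f = [[0.0] * n for _ in range(p)]
    for _ in range(n_iter):
        for j in range(p):
            partial = [y[i] - a - sum(f[k][i] for k in range(p) if k != j)
                       for i in range(n)]
            fj = running_mean_smoother(X_cols[j], partial, window)
            mean_fj = sum(fj) / n
            f[j] = [v - mean_fj for v in fj]
    return a, f

# Toy example: y depends smoothly on x1 only; x2 is a decoy covariate.
x1 = [1, 2, 3, 4, 5, 6, 7, 8]
x2 = [3, 1, 4, 1, 5, 9, 2, 6]
y = [v * v / 10 for v in x1]
a, f = backfit([x1, x2], y)
fitted = [a + f[0][i] + f[1][i] for i in range(len(y))]
```

In practice the running mean would be replaced by a proper scatterplot smoother (running lines, splines, loess), and for non-Gaussian likelihoods the smoothing step is wrapped inside the iteratively reweighted Local Scoring loop described above.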
An introduction to statistical learning : with applications in R by Gareth James (Book)
14 editions published between 2013 and 2015 in English and held by 333 WorldCat member libraries worldwide
"An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra. Provides tools for Statistical Learning that are essential for practitioners in science, industry and other fields. Analyses and methods are presented in R. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, and clustering. Extensive use of color graphics assists the reader"--Publisher description
The science of Bradley Efron : selected papers by Bradley Efron (Book)
9 editions published between 2008 and 2011 in English and held by 100 WorldCat member libraries worldwide
Statistical learning with sparsity : the lasso and generalizations by Trevor Hastie (Book)
17 editions published in 2015 in 3 languages and held by 96 WorldCat member libraries worldwide
Discover new methods for dealing with high-dimensional data. A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data. Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized l
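The coordinate descent algorithm mentioned in the abstract is simple enough to sketch directly. This is a bare-bones illustration assuming no intercept and no predictor standardization (real implementations such as glmnet handle both, plus warm starts over a path of penalties); the data are invented:

```python
def soft_threshold(z, lam):
    """Soft-thresholding operator: sign(z) * max(|z| - lam, 0)."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the lasso:
    minimize (1/2) * sum_i (y_i - sum_j x_ij * b_j)^2 + lam * sum_j |b_j|.
    Each pass updates one coefficient at a time by soft-thresholding
    its least-squares update on the partial residual."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove the fit of every predictor except j.
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            zj = sum(X[i][j] * r[i] for i in range(n))
            xj_sq = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(zj, lam) / xj_sq
    return beta

# Toy data: y = 2 * x1 exactly, x2 is a correlated decoy.
X = [[1, 1], [2, 1], [3, 2], [4, 2], [5, 3]]
y = [2, 4, 6, 8, 10]
b_ols = lasso_cd(X, y, lam=0.0)    # no penalty: recovers [2.0, 0.0]
b_big = lasso_cd(X, y, lam=200.0)  # heavy penalty: both coefficients zeroed
```

With no penalty the update is just coordinate-wise least squares; as `lam` grows, the soft threshold first shrinks and then exactly zeroes coefficients, which is the sparsity property the book is about.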
Local likelihood estimation by Robert Tibshirani (Book)
9 editions published between 1984 and 1986 in English and held by 8 WorldCat member libraries worldwide
This paper extends the idea of local averaging to likelihood-based models. One such application is to the class of generalized linear models. The author enlarges this class by replacing the linear covariate form xβ with an unspecified smooth function. This function is estimated from the data by a technique called Local Likelihood Estimation, a type of local averaging. Multiple covariates are incorporated through a forward stepwise algorithm. The main application discussed, however, is to the proportional hazards model of Cox (1972) for censored data. In a number of real data examples, the local likelihood technique proves to be effective in uncovering nonlinear dependencies. Finally, the author gives some asymptotic results for local likelihood estimates and provides some methods for inference.
Tōkeiteki gakushū no kiso : dēta mainingu, suiron, yosoku [The foundations of statistical learning : data mining, inference, prediction] (Book)
2 editions published in 2014 in Japanese and held by 6 WorldCat member libraries worldwide
Bootstrap confidence intervals by Robert Tibshirani (Book)
7 editions published in 1984 in English and held by 5 WorldCat member libraries worldwide
We describe the various techniques that have been proposed for constructing nonparametric confidence intervals using the bootstrap. These include bootstrap pivotal intervals, percentile and bias-corrected percentile intervals, and nonparametric tilting intervals. These methods are small-sample improvements over the usual ± standard deviation intervals. We discuss them in detail, outlining the underlying assumptions in each case. We show how the nonparametric tilting interval can be viewed as an extension of a bootstrap pivotal interval, and suggest a number of generalizations. Finally, the various intervals are compared in a small simulation study.
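Of the interval types listed, the percentile interval is the simplest to state: take empirical quantiles of the bootstrap replicates. A minimal sketch (illustrative only; the data are invented, and bias-corrected and tilting intervals refine this basic recipe):

```python
import random
import statistics

def percentile_interval(data, stat=statistics.mean, alpha=0.10, B=1000, seed=0):
    """Bootstrap percentile interval: the alpha/2 and 1 - alpha/2
    empirical quantiles of the B bootstrap replicates of `stat`."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([rng.choice(data) for _ in range(n)]) for _ in range(B))
    lo = reps[int(B * alpha / 2)]
    hi = reps[int(B * (1 - alpha / 2)) - 1]
    return lo, hi

data = [14, 10, 12, 19, 11, 13, 17, 15, 16, 12]
lo, hi = percentile_interval(data)  # a 90% interval for the mean
```

Unlike the ± standard deviation interval, the endpoints here need not be symmetric about the point estimate, which is what lets bootstrap intervals adapt to skewed sampling distributions.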
The problem of regions by Bradley Efron (Book)
5 editions published in 1997 in English and held by 4 WorldCat member libraries worldwide
Cross-validation and the bootstrap : estimating the error rate of a prediction rule by Bradley Efron (Book)
5 editions published in 1995 in English and held by 4 WorldCat member libraries worldwide
Bootstrap confidence intervals and bootstrap approximations by Thomas J. DiCiccio (Book)
4 editions published between 1985 and 1986 in English and held by 4 WorldCat member libraries worldwide
This document studies the BCa bootstrap procedure for constructing parametric and nonparametric confidence intervals. The BCa interval relies on the existence of a transformation that maps the problem into a normal scaled transformation family. The authors show how to construct this transformation in general. Exploiting this, they derive an interval that equals the BCa interval to second order, computable without bootstrap sampling. As a further benefit, this construction provides a second-order correct approximation to the bootstrap distribution of a statistic, computed without bootstrap sampling. Both the new interval and the approximation require only n+2 evaluations of the statistic, where n is the sample size. (Author)
How many bootstraps? by Robert Tibshirani (Book)
4 editions published in 1985 in English and held by 4 WorldCat member libraries worldwide
The bootstrap is a nonparametric method for assessing statistical accuracy. In approximating bootstrap quantities by Monte Carlo simulation, one must decide how many bootstrap samples to generate. This document proposes an adaptive sequential method that estimates the accuracy based on the current bootstrap samples. Bootstrap sampling is continued until the estimated accuracy is high enough. In the examples given, 100 to 300 bootstraps are sufficient for standard error and bias estimation, while 1000 bootstraps may be necessary for estimating a percentile.
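The flavor of the sequential proposal can be sketched as follows. The stopping rule here is an invented stand-in for the paper's accuracy estimate, based on the rough fact that the coefficient of variation of a sample standard deviation from B draws is about 1/sqrt(2B):

```python
import random
import statistics

def adaptive_bootstrap_se(data, stat=statistics.mean, tol=0.05,
                          batch=50, max_B=5000, seed=0):
    """Sequential bootstrap: draw replicates in batches and stop once
    the estimated Monte Carlo error of the SE estimate, relative to the
    SE itself, falls below `tol`. Rough rule used here: the coefficient
    of variation of a sample SD based on B draws is ~ 1 / sqrt(2B)."""
    rng = random.Random(seed)
    n = len(data)
    reps = []
    while len(reps) < max_B:
        reps.extend(stat([rng.choice(data) for _ in range(n)])
                    for _ in range(batch))
        B = len(reps)
        if B >= 100 and 1.0 / (2 * B) ** 0.5 < tol:
            break
    return statistics.stdev(reps), len(reps)

data = [14, 10, 12, 19, 11, 13, 17, 15, 16, 12]
se, B_used = adaptive_bootstrap_se(data)
```

With a 5% tolerance this rule stops at 250 replicates, in line with the abstract's claim that a few hundred bootstraps suffice for standard errors while tail percentiles need considerably more.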
The bootstrap method for assessing statistical accuracy by Bradley Efron (Book)
5 editions published in 1985 in English and held by 4 WorldCat member libraries worldwide
This is an invited review of bootstrap methods. It begins with an exposition of the bootstrap estimate of standard error for one-sample situations. Several examples, some involving quite complicated statistical procedures, are given. The bootstrap is then extended to other measures of statistical accuracy, like bias and prediction error, and to complicated data structures such as time series, censored data, and regression models. Several more examples are presented illustrating these ideas. The last third of the paper deals mainly with bootstrap confidence intervals. The paper ends with a FORTRAN program for bootstrap standard errors.
Pre-validation and inference in microarrays by Robert Tibshirani (Book)
4 editions published in 2002 in English and held by 3 WorldCat member libraries worldwide
Statistical data analysis in the computer age by Bradley Efron (Book)
5 editions published between 1990 and 1991 in English and held by 3 WorldCat member libraries worldwide
Computer-intensive statistical methods by Bradley Efron (Book)
3 editions published in 1995 in English and held by 3 WorldCat member libraries worldwide
Additive logistic regression : a statistical view of boosting by J. H. Friedman (Book)
3 editions published in 1998 in English and held by 3 WorldCat member libraries worldwide
Using specially designed exponential families for density estimation by Bradley Efron (Book)
4 editions published in 1994 in English and held by 2 WorldCat member libraries worldwide
Discriminant adaptive nearest neighbor classification by Trevor Hastie (Book)
4 editions published in 1994 in English and held by 2 WorldCat member libraries worldwide
Related Identities
Friedman, J. H. (Jerome H.): Thesis advisor, Author
Hastie, Trevor: Other, Thesis advisor, Author
Efron, Bradley: Speaker, Thesis advisor, Author
Witten, Daniela: Other
James, Gareth (Gareth Michael): Author
Wainwright, Martin: Author, Contributor
Morris, Carl N.
Stanford University. Department of Statistics
Associated Subjects
Artificial intelligence; Asymptotic efficiencies (Statistics); Bayesian statistical decision theory; Bioinformatics; Biology--Data processing; Bootstrap (Statistics); Computational biology; Computational intelligence; Computers; Computer science; Confidence intervals; Database management; Data mining; Efron, Bradley; Electronic data processing; Estimation theory; Forecasting; Forests and forestry--Data processing; Forest surveys; Inference; Least squares; Linear models (Statistics); Machine learning; Mathematical models; Mathematical statistics; Mathematics--Data processing; Proof theory; R (Computer program language); Random walks (Mathematics); Regression analysis; Regression analysis--Mathematical models--Evaluation; Sampling (Statistics); Smoothing (Statistics); Sparse matrices; Statistics; Statistics--Methodology; Supervised learning (Machine learning); Transformations (Mathematics); United States
Alternative Names
Robert Tibshirani, American statistician [Dutch]
Robert Tibshirani, Canadian-American statistician [French]
Robert Tibshirani, American statistician [Italian]
Tibshirani, R. J.
Tibshirani, R. J., 1956-
Tibshirani, R. J. (Robert J.)
Tibshirani, R. J. (Robert J.), 1956-
Tibshirani, Rob
Tibshirani, Rob J.
Tibshirani, Robert
Tibshirani, Robert J.
Tibshirani, Robert J., 1956-
Tibshirani, Robert John, 1956-
Роберт Тибширани