Introduction to linear regression analysis

Author: Douglas C Montgomery; Elizabeth A Peck; G Geoffrey Vining
Publisher: Hoboken, NJ : Wiley, 2012.
Series: Wiley series in probability and statistics.
Edition/Format: Print book : English : 5th ed.
Summary:
"This book describes both the conventional and less common uses of linear regression in the practical context of today's mathematical and scientific research"--

Details

Material Type: Internet resource
Document Type: Book, Internet Resource
All Authors / Contributors: Douglas C Montgomery; Elizabeth A Peck; G Geoffrey Vining
ISBN: 9780470542811 0470542810 9781119180173 1119180171
OCLC Number: 775329531
Description: xvi, 645 pages : illustrations ; 27 cm.
Contents: 1. INTRODUCTION --
1.1 Regression and Model Building --
1.2 Data Collection --
1.3 Uses of Regression --
1.4 Role of the Computer --
2. SIMPLE LINEAR REGRESSION --
2.1 Simple Linear Regression Model --
2.2 Least-Squares Estimation of the Parameters --
2.3 Hypothesis Testing on the Slope and Intercept --
2.4 Interval Estimation in Simple Linear Regression --
2.5 Prediction of New Observations --
2.6 Coefficient of Determination --
2.7 A Service Industry Application of Regression --
2.8 Using SAS and R for Simple Linear Regression --
2.9 Some Considerations in the Use of Regression --
2.10 Regression Through the Origin --
2.11 Estimation by Maximum Likelihood --
2.12 Case Where the Regressor x is Random --
3. MULTIPLE LINEAR REGRESSION --
3.1 Multiple Regression Models --
3.2 Estimation of the Model Parameters --
3.3 Hypothesis Testing in Multiple Linear Regression --
3.4 Confidence Intervals in Multiple Regression --
3.5 Prediction of New Observations --
3.6 A Multiple Regression Model for the Patient Satisfaction Data --
3.7 Using SAS and R for Basic Multiple Linear Regression --
3.8 Hidden Extrapolation in Multiple Regression --
3.9 Standardized Regression Coefficients --
3.10 Multicollinearity --
3.11 Why Do Regression Coefficients Have the Wrong Sign? --
4. MODEL ADEQUACY CHECKING --
4.1 Introduction --
4.2 Residual Analysis --
4.3 PRESS Statistic --
4.4 Detection and Treatment of Outliers --
4.5 Lack of Fit of the Regression Model --
5. TRANSFORMATIONS AND WEIGHTING TO CORRECT MODEL INADEQUACIES --
5.1 Introduction --
5.2 Variance-Stabilizing Transformations --
5.3 Transformations to Linearize the Model --
5.4 Analytical Methods for Selecting a Transformation --
5.5 Generalized and Weighted Least Squares --
5.6 Regression Models with Random Effect --
6. DIAGNOSTICS FOR LEVERAGE AND INFLUENCE --
6.1 Importance of Detecting Influential Observations --
6.2 Leverage --
6.3 Measures of Influence: Cook's D --
6.4 Measures of Influence: DFFITS and DFBETAS --
6.5 A Measure of Model Performance --
6.6 Detecting Groups of Influential Observations --
6.7 Treatment of Influential Observations --
7. POLYNOMIAL REGRESSION MODELS --
7.1 Introduction --
7.2 Polynomial Models in One Variable --
7.3 Nonparametric Regression --
7.4 Polynomial Models in Two or More Variables --
7.5 Orthogonal Polynomials --
8. INDICATOR VARIABLES --
8.1 General Concept of Indicator Variables --
8.2 Comments on the Use of Indicator Variables --
8.3 Regression Approach to Analysis of Variance --
9. MULTICOLLINEARITY --
9.1 Introduction --
9.2 Sources of Multicollinearity --
9.3 Effects of Multicollinearity --
9.4 Multicollinearity Diagnostics --
9.5 Methods for Dealing with Multicollinearity --
9.6 Using SAS to Perform Ridge and Principal-Component Regression --
10. VARIABLE SELECTION AND MODEL BUILDING --
10.1 Introduction --
10.2 Computational Techniques for Variable Selection --
10.3 Strategy for Variable Selection and Model Building --
10.4 Case Study: Gorman and Toman Asphalt Data Using SAS --
11. VALIDATION OF REGRESSION MODELS --
11.1 Introduction --
11.2 Validation Techniques --
11.3 Data from Planned Experiments --
12. INTRODUCTION TO NONLINEAR REGRESSION --
12.1 Linear and Nonlinear Regression Models --
12.2 Origins of Nonlinear Models --
12.3 Nonlinear Least Squares --
12.4 Transformation to a Linear Model --
12.5 Parameter Estimation in a Nonlinear System --
12.6 Statistical Inference in Nonlinear Regression --
12.7 Examples of Nonlinear Regression Models --
12.8 Using SAS and R --
13. GENERALIZED LINEAR MODELS --
13.1 Introduction --
13.2 Logistic Regression Models --
13.3 Poisson Regression --
13.4 The Generalized Linear Model --
14. REGRESSION ANALYSIS OF TIME SERIES DATA --
14.1 Introduction to Regression Models for Time Series Data --
14.2 Detecting Autocorrelation: The Durbin-Watson Test --
14.3 Estimating the Parameters in Time Series Regression Models --
15. OTHER TOPICS IN THE USE OF REGRESSION ANALYSIS --
15.1 Robust Regression --
15.2 Effect of Measurement Errors in the Regressors --
15.3 Inverse Estimation - The Calibration Problem --
15.4 Bootstrapping in Regression --
15.5 Classification and Regression Trees (CART) --
15.6 Neural Networks --
15.7 Designed Experiments for Regression --
APPENDIX A. STATISTICAL TABLES --
APPENDIX B. DATA SETS FOR EXERCISES --
APPENDIX C. SUPPLEMENTAL TECHNICAL MATERIAL --
C.1 Background on Basic Test Statistics --
C.2 Background from the Theory of Linear Models --
C.3 Important Results on SSR and SSRes --
C.4 Gauss-Markov Theorem, Var(ε) = σ²I --
C.5 Computational Aspects of Multiple Regression --
C.6 Result on the Inverse of a Matrix --
C.7 Development of the PRESS Statistic --
C.8 Development of S²(i) --
C.9 Outlier Test Based on R-Student --
C.10 Independence of Residuals and Fitted Values --
C.11 Gauss-Markov Theorem, Var(ε) = V --
C.12 Bias in MSRes When the Model Is Underspecified --
C.13 Computation of Influence Diagnostics --
C.14 Generalized Linear Models --
APPENDIX D. INTRODUCTION TO SAS --
D.1 Basic Data Entry --
D.2 Creating Permanent SAS Data Sets --
D.3 Importing Data from an EXCEL File --
D.4 Output Command --
D.5 Log File --
D.6 Adding Variables to an Existing SAS Data Set --
APPENDIX E. INTRODUCTION TO R TO PERFORM LINEAR REGRESSION ANALYSIS --
E.1 Basic Background on R --
E.2 Basic Data Entry --
E.3 Brief Comments on Other Functionality in R --
E.4 R Commander.
Series Title: Wiley series in probability and statistics.
Responsibility: Douglas C. Montgomery, Elizabeth A. Peck, G. Geoffrey Vining.
More information:

Abstract:

This Fifth Edition introduces and features the use of R and JMP software. SAS, S-Plus, and Minitab continue to be employed in this new edition, and the output from all of these ...
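As a brief illustration of the kind of analysis the book's R coverage addresses (Section 2.8, Section 3.7, and Appendix E), the sketch below fits a simple linear regression in base R. It is a minimal, illustrative example only: the data vectors x and y are hypothetical and are not taken from the book.

    # Minimal sketch: simple linear regression in base R.
    # The x and y values below are hypothetical, for illustration only.
    x <- c(1, 2, 3, 4, 5, 6, 7, 8)                       # regressor
    y <- c(2.1, 3.9, 6.2, 7.8, 10.1, 12.2, 13.8, 16.1)   # response
    fit <- lm(y ~ x)                     # least-squares fit of y = b0 + b1*x + error
    summary(fit)                         # coefficient estimates, t-tests, R-squared
    confint(fit, level = 0.95)           # interval estimates for intercept and slope
    predict(fit, newdata = data.frame(x = 9),
            interval = "prediction")     # prediction interval for a new observation

Chapter 2 of the text develops each of these steps (least-squares estimation, hypothesis tests on the slope and intercept, interval estimation, and prediction of new observations) in detail.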

Reviews

Editorial reviews

Publisher Synopsis

The book can be used for statistics and engineering courses on regression at the upper-undergraduate and graduate levels. It also serves as a resource for professionals in the fields of engineering, ...

 

Linked Data


Primary Entity

<http://www.worldcat.org/oclc/775329531> # Introduction to linear regression analysis
    a schema:CreativeWork, schema:Book ;
    library:oclcnum "775329531" ;
    library:placeOfPublication <http://experiment.worldcat.org/entity/work/data/488891#Place/hoboken_nj> ; # Hoboken, NJ
    library:placeOfPublication <http://id.loc.gov/vocabulary/countries/nju> ;
    schema:about <http://experiment.worldcat.org/entity/work/data/488891#Topic/mathematics> ; # Mathematics
    schema:about <http://experiment.worldcat.org/entity/work/data/488891#Topic/mathematics_probability_&_statistics_general> ; # MATHEMATICS--Probability & Statistics--General
    schema:about <http://experiment.worldcat.org/entity/work/data/488891#Topic/lineare_regression> ; # Lineare Regression
    schema:about <http://id.worldcat.org/fast/1432090> ; # Regression analysis
    schema:about <http://dewey.info/class/519.536/e23/> ;
    schema:bookEdition "5th ed." ;
    schema:bookFormat bgn:PrintBook ;
    schema:contributor <http://viaf.org/viaf/109359839> ; # Elizabeth A. Peck
    schema:contributor <http://viaf.org/viaf/7631185> ; # G. Geoffrey Vining
    schema:creator <http://viaf.org/viaf/51818713> ; # Douglas C. Montgomery
    schema:datePublished "2012" ;
    schema:description "4. MODEL ADEQUACY CHECKING -- 4.1 Introduction -- 4.2 Residual Analysis -- 4.3 PRESS Statistic -- 4.4 Detection and Treatment of Outliers -- 4.5 Lack of Fit of the Regression Model -- 5. TRANSFORMATIONS AND WEIGHTING TO CORRECT MODEL INADEQUACIES -- 5.1 Introduction -- 5.2 Variance-Stabilizing Transformations -- 5.3 Transformations to Linearize the Model -- 5.4 Analytical Methods for Selecting a Transformation -- 5.5 Generalized and Weighted Least Squares -- 5.6 Regression Models with Random Effect -- 6. DIAGNOSTICS FOR LEVERAGE AND INFLUENCE -- 6.1 Importance of Detecting Influential Observations -- 6.2 Leverage -- 6.3 Measures of Influence: Cook's D -- 6.4 Measures of Influence: DFFITS and DFBETAS -- 6.5 A Measure of Model Performance -- 6.6 Detecting Groups of Influential Observations -- 6.7 Treatment of Influential Observations -- 7. POLYNOMIAL REGRESSION MODELS -- 7.1 Introduction -- 7.2 Polynomial Models in One Variable -- 7.3 Nonparametric Regression -- 7.4 Polynomial Models in Two or More Variables -- 7.5 Orthogonal Polynomials."@en ;
    schema:description ""This book describes both the conventional and less common uses of linear regression in the practical context of today's mathematical and scientific research"--"@en ;
    schema:description "1. INTRODUCTION -- 1.1 Regression and Model Building -- 1.2 Data Collection -- 1.3 Uses of Regression -- 1.4 Role of the Computer -- 2. SIMPLE LINEAR REGRESSION -- 2.1 Simple Linear Regression Model -- 2.2 Least-Squares Estimation of the Parameters -- 2.3 Hypothesis Testing on the Slope and Intercept -- 2.4 Interval Estimation in Simple Linear Regression -- 2.5 Prediction of New Observations -- 2.6 Coefficient of Determination -- 2.7 A Service Industry Application of Regression -- 2.8 Using SAS and R for Simple Linear Regression -- 2.9 Some Considerations in the Use of Regression -- 2.10 Regression Through the Origin -- 2.11 Estimation by Maximum Likelihood -- 2.12 Case Where the Regressor x is Random -- 3. MULTIPLE LINEAR REGRESSION -- 3.1 Multiple Regression Models -- 3.2 Estimation of the Model Parameters -- 3.3 Hypothesis Testing in Multiple Linear Regression -- 3.4 Confidence Intervals in Multiple Regression -- 3.5 Prediction of New Observations -- 3.6 A Multiple Regression Model for the Patient Satisfaction Data -- 3.7 Using SAS and R for Basic Multiple Linear Regression -- 3.8 Hidden Extrapolation in Multiple Regression -- 3.9 Standardized Regression Coefficients -- 3.10 Multicollinearity -- 3.11 Why Do Regression Coefficients Have the Wrong Sign?"@en ;
    schema:description "8. INDICATOR VARIABLES -- 8.1 General Concept of Indicator Variables -- 8.2 Comments on the Use of Indicator Variables -- 8.3 Regression Approach to Analysis of Variance -- 9. MULTICOLLINEARITY -- 9.1 Introduction -- 9.2 Sources of Multicollinearity -- 9.3 Effects of Multicollinearity -- 9.4 Multicollinearity Diagnostics -- 9.5 Methods for Dealing with Multicollinearity -- 9.6 Using SAS to Perform Ridge and Principal-Component Regression -- 10. VARIABLE SELECTION AND MODEL BUILDING -- 10.1 Introduction -- 10.2 Computational Techniques for Variable Selection -- 10.3 Strategy for Variable Selection and Model Building -- 10.4 Case Study: Gorman and Toman Asphalt Data Using SAS -- 11. VALIDATION OF REGRESSION MODELS -- 11.1 Introduction -- 11.2 Validation Techniques -- 11.3 Data from Planned Experiments -- 12. INTRODUCTION TO NONLINEAR REGRESSION -- 12.1 Linear and Nonlinear Regression Models -- 12.2 Origins of Nonlinear Models -- 12.3 Nonlinear Least Squares -- 12.4 Transformation to a Linear Model -- 12.5 Parameter Estimation in a Nonlinear System -- 12.6 Statistical Inference in Nonlinear Regression -- 12.7 Examples of Nonlinear Regression Models -- 12.8 Using SAS and R."@en ;
    schema:description "C.5 Computational Aspects of Multiple Regression -- C.6 Result on the Inverse of a Matrix -- C.7 Development of the PRESS Statistic -- C.8 Development of S2 (i) -- C.9 Outlier Test Based on R-Student -- C.10 Independence of Residuals and Fitted Values -- C.11 Gauss-Markov Theorem, Var(epsilon) = V -- C.12 Bias in MSRes When the Model Is Underspecified -- C.13 Computation of Influence Diagnostics -- C.14 Generalized Linear Models -- APPENDIX D. INTRODUCTION TO SAS -- D.1 Basic Data Entry -- D.2 Creating Permanent SAS Data Sets -- D.3 Importing Data from an EXCEL File -- D.4 Output Command -- D.5 Log File -- D.6 Adding Variables to an Existing SAS Data Set -- APPENDIX E. INTRODUCTION TO R TO PERFORM LINEAR REGRESSION ANALYSIS -- E.1 Basic Background on R -- E.2 Basic Data Entry -- E.3 Brief Comments on Other Functionality in R -- E.4 R Commander."@en ;
    schema:description "13. GENERALIZED LINEAR MODELS -- 13.1 Introduction -- 13.2 Logistic Regression Models -- 13.3 Poisson Regression -- 13.4 The Generalized Linear Model -- 14. REGRESSION ANALYSIS OF TIME SERIES DATA -- 14.1 Introduction to Regression Models for Time Series Data -- 14.2 Detecting Autocorrelation: The Durbin-Watson Test -- 14.3 Estimating the Parameters in Time Series Regression Models -- 15. OTHER TOPICS IN THE USE OF REGRESSION ANALYSIS -- 15.1 Robust Regression -- 15.2 Effect of Measurement Errors in the Regressors -- 15.3 Inverse Estimation -- The Calibration Problem -- 15.4 Bootstrapping in Regression -- 15.5 Classification and Regression Trees (CART) -- 15.6 Neural Networks -- 15.7 Designed Experiments for Regression -- APPENDIX A. STATISTICAL TABLES -- APPENDIX B. DATA SETS FOR EXERCISES -- APPENDIX C. SUPPLEMENTAL TECHNICAL MATERIAL -- C.1 Background on Basic Test Statistics -- C.2 Background from the Theory of Linear Models -- C.3 Important Results on SSR and SSRes -- C.4 Gauss-Markov Theorem, Var(epsilon) = sigma2I."@en ;
    schema:exampleOfWork <http://worldcat.org/entity/work/id/488891> ;
    schema:image <http://catalogimages.wiley.com/images/db/jimages/9780470542811.jpg> ;
    schema:inLanguage "en" ;
    schema:isPartOf <http://experiment.worldcat.org/entity/work/data/488891#Series/wiley_series_in_probability_and_statistics> ; # Wiley series in probability and statistics.
    schema:name "Introduction to linear regression analysis"@en ;
    schema:productID "775329531" ;
    schema:publication <http://www.worldcat.org/title/-/oclc/775329531#PublicationEvent/hoboken_nj_wiley_2012> ;
    schema:publisher <http://experiment.worldcat.org/entity/work/data/488891#Agent/wiley> ; # Wiley
    schema:url <http://lib.myilibrary.com?id=809573> ;
    schema:url <http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1021709> ;
    schema:workExample <http://worldcat.org/isbn/9780470542811> ;
    schema:workExample <http://worldcat.org/isbn/9781119180173> ;
    umbel:isLike <http://bnb.data.bl.uk/id/resource/GBB0D6081> ;
    wdrs:describedby <http://www.worldcat.org/title/-/oclc/775329531> ;
    .


Related Entities

<http://experiment.worldcat.org/entity/work/data/488891#Series/wiley_series_in_probability_and_statistics> # Wiley series in probability and statistics.
    a bgn:PublicationSeries ;
    schema:hasPart <http://www.worldcat.org/oclc/775329531> ; # Introduction to linear regression analysis
    schema:name "Wiley series in probability and statistics." ;
    schema:name "Wiley series in probability and statistics" ;
    .

<http://experiment.worldcat.org/entity/work/data/488891#Topic/lineare_regression> # Lineare Regression
    a schema:Intangible ;
    schema:name "Lineare Regression"@en ;
    .

<http://experiment.worldcat.org/entity/work/data/488891#Topic/mathematics_probability_&_statistics_general> # MATHEMATICS--Probability & Statistics--General
    a schema:Intangible ;
    schema:name "MATHEMATICS--Probability & Statistics--General"@en ;
    .

<http://id.worldcat.org/fast/1432090> # Regression analysis
    a schema:Intangible ;
    schema:name "Regression analysis"@en ;
    .

<http://lib.myilibrary.com?id=809573>
    rdfs:comment "Rutgers restricted" ;
    rdfs:comment "Full text available from MyiLibrary" ;
    .

<http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1021709>
    rdfs:comment "Rutgers restricted" ;
    rdfs:comment "Full text available from EBSCO PALCI DDA Purchased Titles 2015" ;
    .

<http://viaf.org/viaf/109359839> # Elizabeth A. Peck
    a schema:Person ;
    schema:birthDate "1953" ;
    schema:familyName "Peck" ;
    schema:givenName "Elizabeth A." ;
    schema:name "Elizabeth A. Peck" ;
    .

<http://viaf.org/viaf/51818713> # Douglas C. Montgomery
    a schema:Person ;
    schema:familyName "Montgomery" ;
    schema:givenName "Douglas C." ;
    schema:name "Douglas C. Montgomery" ;
    .

<http://viaf.org/viaf/7631185> # G. Geoffrey Vining
    a schema:Person ;
    schema:birthDate "1954" ;
    schema:familyName "Vining" ;
    schema:givenName "G. Geoffrey" ;
    schema:name "G. Geoffrey Vining" ;
    .

<http://worldcat.org/isbn/9780470542811>
    a schema:ProductModel ;
    schema:isbn "0470542810" ;
    schema:isbn "9780470542811" ;
    .

<http://worldcat.org/isbn/9781119180173>
    a schema:ProductModel ;
    schema:isbn "1119180171" ;
    schema:isbn "9781119180173" ;
    .

