Book description
This classic text on multiple regression is noted for its nonmathematical, applied, and data-analytic approach. Readers profit from its verbal-conceptual exposition and frequent use of examples.
The applied emphasis provides clear illustrations of the principles and worked examples of the types of applications that are possible. Researchers learn how to specify regression models that directly address their research questions. An overview of the fundamental ideas of multiple regression, together with a review of bivariate correlation and regression and other elementary statistical concepts, provides a strong foundation for understanding the rest of the text. The third edition features an increased emphasis on graphics and on the use of confidence intervals and effect size measures, and an accompanying website with data for most of the numerical examples, along with computer code for SPSS, SAS, and SYSTAT, at www.psypress.com/9780805822236.
Applied Multiple Regression serves both as a textbook for graduate students and as a reference tool for researchers in psychology, education, health sciences, communications, business, sociology, political science, anthropology, and economics. An introductory knowledge of statistics is required. Self-standing chapters minimize the need for researchers to refer back to earlier chapters.
Table of contents
- Cover
- Half Title
- Title
- Copyright
- Dedication
- Contents
- Preface
- Chapter 1: Introduction
- 1.1 Multiple Regression/Correlation as a General Data-Analytic System
- 1.2 A Comparison of Multiple Regression/Correlation and Analysis of Variance Approaches
- 1.3 Multiple Regression/Correlation and the Complexity of Behavioral Science
- 1.4 Orientation of the Book
- 1.5 Computation, the Computer, and Numerical Results
- 1.6 The Spectrum of Behavioral Science
- 1.7 Plan for the Book
- 1.8 Summary
- Chapter 2: Bivariate Correlation and Regression
- 2.1 Tabular and Graphic Representations of Relationships
- 2.2 The Index of Linear Correlation Between Two Variables: The Pearson Product Moment Correlation Coefficient
- 2.3 Alternative Formulas for the Product Moment Correlation Coefficient
- 2.4 Regression Coefficients: Estimating Y From X
- 2.5 Regression Toward the Mean
- 2.6 The Standard Error of Estimate and Measures of the Strength of Association
- 2.7 Summary of Definitions and Interpretations
- 2.8 Statistical Inference With Regression and Correlation Coefficients
- 2.9 Precision and Power
- 2.10 Factors Affecting the Size of r
- 2.11 Summary
- Chapter 3: Multiple Regression/Correlation With Two or More Independent Variables
- 3.1 Introduction: Regression and Causal Models
- 3.2 Regression With Two Independent Variables
- 3.3 Measures of Association With Two Independent Variables
- 3.4 Patterns of Association Between Y and Two Independent Variables
- 3.5 Multiple Regression/Correlation With k Independent Variables
- 3.6 Statistical Inference With k Independent Variables
- 3.7 Statistical Precision and Power Analysis
- 3.8 Using Multiple Regression Equations in Prediction
- 3.9 Summary
- Chapter 4: Data Visualization, Exploration, and Assumption Checking: Diagnosing and Solving Regression Problems I
- Chapter 5: Data-Analytic Strategies Using Multiple Regression/Correlation
- 5.1 Research Questions Answered by Correlations and Their Squares
- 5.1.1 Net Contribution to Prediction
- 5.1.2 Indices of Differential Validity
- 5.1.3 Comparisons of Predictive Utility
- 5.1.4 Attribution of a Fraction of the XY Relationship to a Third Variable
- 5.1.5 Which of Two Variables Accounts for More of the XY Relationship?
- 5.1.6 Are the Various Squared Correlations in One Population Different From Those in Another Given the Same Variables?
- 5.2 Research Questions Answered by B or β
- 5.2.1 Regression Coefficients as Reflections of Causal Effects
- 5.2.2 Alternative Approaches to Making BYX Substantively Meaningful
- 5.2.3 Are the Effects of a Set of Independent Variables on Two Different Outcomes in a Sample Different?
- 5.2.4 What Are the Reciprocal Effects of Two Variables on One Another?
- 5.3 Hierarchical Analysis Variables in Multiple Regression/Correlation
- 5.4 The Analysis of Sets of Independent Variables
- 5.5 Significance Testing for Sets
- 5.6 Power Analysis for Sets
- 5.7 Statistical Inference Strategy in Multiple Regression/Correlation
- 5.8 Summary
- Chapter 6: Quantitative Scales, Curvilinear Relationships, and Transformations
- 6.1 Introduction
- 6.2 Power Polynomials
- 6.2.1 Method
- 6.2.2 An Example: Quadratic Fit
- 6.2.3 Centering Predictors in Polynomial Equations
- 6.2.4 Relationship of Test of Significance of Highest Order Coefficient and Gain in Prediction
- 6.2.5 Interpreting Polynomial Regression Results
- 6.2.6 Another Example: A Cubic Fit
- 6.2.7 Strategy and Limitations
- 6.2.8 More Complex Equations
- 6.3 Orthogonal Polynomials
- 6.4 Nonlinear Transformations
- 6.4.1 Purposes of Transformation and the Nature of Transformations
- 6.4.2 The Conceptual Basis of Transformations and Model Checking Before and After Transformation—Is It Always Ideal to Transform?
- 6.4.3 Logarithms and Exponents; Additive and Proportional Relationships
- 6.4.4 Linearizing Relationships
- 6.4.5 Linearizing Relationships Based on Strong Theoretical Models
- 6.4.6 Linearizing Relationships Based on Weak Theoretical Models
- 6.4.7 Empirically Driven Transformations in the Absence of Strong or Weak Models
- 6.4.8 Empirically Driven Transformation for Linearization: The Ladder of Re-expression and the Bulging Rule
- 6.4.9 Empirically Driven Transformation for Linearization in the Absence of Models: Box-Cox Family of Power Transformations on Y
- 6.4.10 Empirically Driven Transformation for Linearization in the Absence of Models: Box-Tidwell Family of Power Transformations on X
- 6.4.11 Linearization of Relationships With Correlations: Fisher z′ Transform of r
- 6.4.12 Transformations That Linearize Relationships for Counts and Proportions
- 6.4.13 Variance Stabilizing Transformations and Alternatives for Treatment of Heteroscedasticity
- 6.4.14 Transformations to Normalize Variables
- 6.4.15 Diagnostics Following Transformation
- 6.4.16 Measuring and Comparing Model Fit
- 6.4.17 Second-Order Polynomial Numerical Example Revisited
- 6.4.18 When to Transform and the Choice of Transformation
- 6.5 Nonlinear Regression
- 6.6 Nonparametric Regression
- 6.7 Summary
- Chapter 7: Interactions Among Continuous Variables
- 7.1 Introduction
- 7.2 Centering Predictors and the Interpretation of Regression Coefficients in Equations Containing Interactions
- 7.2.1 Regression with Centered Predictors
- 7.2.2 Relationship Between Regression Coefficients in the Uncentered and Centered Equations
- 7.2.3 Centered Equations With No Interaction
- 7.2.4 Essential Versus Nonessential Multicollinearity
- 7.2.5 Centered Equations With Interactions
- 7.2.6 The Highest Order Interaction in the Centered Versus Uncentered Equation
- 7.2.7 Do Not Center Y
- 7.2.8 A Recommendation for Centering
- 7.3 Simple Regression Equations and Simple Slopes
- 7.3.1 Plotting Interactions
- 7.3.2 Moderator Variables
- 7.3.3 Simple Regression Equations
- 7.3.4 Overall Regression Coefficient and Simple Slope at the Mean
- 7.3.5 Simple Slopes From Uncentered Versus Centered Equations Are Identical
- 7.3.6 Linear by Linear Interactions
- 7.3.7 Interpreting Interactions in Multiple Regression and Analysis of Variance
- 7.4 Post Hoc Probing of Interactions
- 7.4.1 Standard Error of Simple Slopes
- 7.4.2 Equation Dependence of Simple Slopes and Their Standard Errors
- 7.4.3 Tests of Significance of Simple Slopes
- 7.4.4 Confidence Intervals Around Simple Slopes
- 7.4.5 A Numerical Example
- 7.4.6 The Uncentered Regression Equation Revisited
- 7.4.7 First-Order Coefficients in Equations Without and With Interactions
- 7.4.8 Interpretation and the Range of Data
- 7.5 Standardized Estimates for Equations Containing Interactions
- 7.6 Interactions as Partialed Effects: Building Regression Equations With Interactions
- 7.7 Patterns of First-Order and Interactive Effects
- 7.8 Three-Predictor Interactions in Multiple Regression
- 7.9 Curvilinear by Linear Interactions
- 7.10 Interactions Among Sets of Variables
- 7.11 Issues in the Detection of Interactions: Reliability, Predictor Distributions, Model Specification
- 7.12 Summary
- Chapter 8: Categorical or Nominal Independent Variables
- 8.1 Introduction
- 8.2 Dummy-Variable Coding
- 8.2.1 Coding the Groups
- 8.2.2 Pearson Correlations of Dummy Variables With Y
- 8.2.3 Correlations Among Dummy-Coded Variables
- 8.2.4 Multiple Correlation of the Dummy-Variable Set With Y
- 8.2.5 Regression Coefficients for Dummy Variables
- 8.2.6 Partial and Semipartial Correlations for Dummy Variables
- 8.2.7 Dummy-Variable Multiple Regression/Correlation and One-Way Analysis of Variance
- 8.2.8 A Cautionary Note: Dummy-Variable-Like Coding Systems
- 8.2.9 Dummy-Variable Coding When Groups Are Not Mutually Exclusive
- 8.3 Unweighted Effects Coding
- 8.4 Weighted Effects Coding
- 8.5 Contrast Coding
- 8.6 Nonsense Coding
- 8.7 Coding Schemes in the Context of Other Independent Variables
- 8.7.1 Combining Nominal and Continuous Independent Variables
- 8.7.2 Calculating Adjusted Means for Nominal Independent Variables
- 8.7.3 Adjusted Means for Combinations of Nominal and Quantitative Independent Variables
- 8.7.4 Adjusted Means for More Than Two Groups and Alternative Coding Methods
- 8.7.5 Multiple Regression/Correlation With Nominal Independent Variables and the Analysis of Covariance
- 8.8 Summary
- Chapter 9: Interactions With Categorical Variables
- 9.1 Nominal Scale by Nominal Scale Interactions
- 9.2 Interactions Involving More Than Two Nominal Scales
- 9.2.1 An Example of Three Nominal Scales Coded by Alternative Methods
- 9.2.2 Interactions Among Nominal Scales in Which Not All Combinations Are Considered
- 9.2.3 What If the Categories for One or More Nominal “Scales” Are Not Mutually Exclusive?
- 9.2.4 Consideration of pr, β, and Variance Proportions for Nominal Scale Interaction Variables
- 9.2.5 Summary of Issues and Recommendations for Interactions Among Nominal Scales
- 9.3 Nominal Scale by Continuous Variable Interactions
- 9.3.1 A Reminder on Centering
- 9.3.2 Interactions of a Continuous Variable With Dummy-Variable Coded Groups
- 9.3.3 Interactions Using Weighted or Unweighted Effects Codes
- 9.3.4 Interactions With a Contrast-Coded Nominal Scale
- 9.3.5 Interactions Coded to Estimate Simple Slopes of Groups
- 9.3.6 Categorical Variable Interactions With Nonlinear Effects of Scaled Independent Variables
- 9.3.7 Interactions of a Scale With Two or More Categorical Variables
- 9.4 Summary
- Chapter 10: Outliers and Multicollinearity: Diagnosing and Solving Regression Problems II
- Chapter 11: Missing Data
- Chapter 12: Multiple Regression/Correlation and Causal Models
- 12.1 Introduction
- 12.2 Models Without Reciprocal Causation
- 12.3 Models With Reciprocal Causation
- 12.4 Identification and Overidentification
- 12.5 Latent Variable Models
- 12.5.1 An Example of a Latent Variable Model
- 12.5.2 How Latent Variables Are Estimated
- 12.5.3 Fixed and Free Estimates in Latent Variable Models
- 12.5.4 Goodness-of-Fit Tests of Latent Variable Models
- 12.5.5 Latent Variable Models and the Correction for Attenuation
- 12.5.6 Characteristics of Data Sets That Make Latent Variable Analysis the Method of Choice
- 12.6 A Review of Causal Model and Statistical Assumptions
- 12.7 Comparisons of Causal Models
- 12.8 Summary
- Chapter 13: Alternative Regression Models: Logistic, Poisson Regression, and the Generalized Linear Model
- 13.1 Ordinary Least Squares Regression Revisited
- 13.2 Dichotomous Outcomes and Logistic Regression
- 13.2.1 Extending Linear Regression: The Linear Probability Model and Discriminant Analysis
- 13.2.2 The Nonlinear Transformation From Predictor to Predicted Scores: Probit and Logistic Transformation
- 13.2.3 The Logistic Regression Equation
- 13.2.4 Numerical Example: Three Forms of the Logistic Regression Equation
- 13.2.5 Understanding the Coefficients for the Predictor in Logistic Regression
- 13.2.6 Multiple Logistic Regression
- 13.2.7 Numerical Example
- 13.2.8 Confidence Intervals on Regression Coefficients and Odds Ratios
- 13.2.9 Estimation of the Regression Model: Maximum Likelihood
- 13.2.10 Deviances: Indices of Overall Fit of the Logistic Regression Model
- 13.2.11 Multiple R2 Analogs in Logistic Regression
- 13.2.12 Testing Significance of Overall Model Fit: The Likelihood Ratio Test and the Test of Model Deviance
- 13.2.13 χ2 Test for the Significance of a Single Predictor in a Multiple Logistic Regression Equation
- 13.2.14 Hierarchical Logistic Regression: Likelihood Ratio χ2 Test for the Significance of a Set of Predictors Above and Beyond Another Set
- 13.2.15 Akaike’s Information Criterion and the Bayesian Information Criterion for Model Comparison
- 13.2.16 Some Treachery in Variable Scaling and Interpretation of the Odds Ratio
- 13.2.17 Regression Diagnostics in Logistic Regression
- 13.2.18 Sparseness of Data
- 13.2.19 Classification of Cases
- 13.3 Extensions of Logistic Regression to Multiple Response Categories: Polytomous Logistic Regression and Ordinal Logistic Regression
- 13.4 Models for Count Data: Poisson Regression and Alternatives
- 13.5 Full Circle: Parallels Between Logistic and Poisson Regression, and the Generalized Linear Model
- 13.6 Summary
- Chapter 14: Random Coefficient Regression and Multilevel Models
- 14.1 Clustering Within Data Sets
- 14.2 Analysis of Clustered Data With Ordinary Least Squares Approaches
- 14.3 The Random Coefficient Regression Model
- 14.4 Random Coefficient Regression Model and Multilevel Data Structure
- 14.4.1 Ordinary Least Squares (Fixed Effects) Regression Revisited
- 14.4.2 Fixed and Random Variables
- 14.4.3 Clustering and Hierarchically Structured Data
- 14.4.4 Structure of the Random Coefficient Regression Model
- 14.4.5 Level 1 Equations
- 14.4.6 Level 2 Equations
- 14.4.7 Mixed Model Equation for Random Coefficient Regression
- 14.4.8 Variance Components—New Parameters in the Multilevel Model
- 14.4.9 Variance Components and Random Coefficient Versus Ordinary Least Squares (Fixed Effects) Regression
- 14.4.10 Parameters of the Random Coefficient Regression Model: Fixed and Random Effects
- 14.5 Numerical Example: Analysis of Clustered Data With Random Coefficient Regression
- 14.6 Clustering as a Meaningful Aspect of the Data
- 14.7 Multilevel Modeling With a Predictor at Level 2
- 14.8 An Experimental Design as a Multilevel Data Structure: Combining Experimental Manipulation With Individual Differences
- 14.9 Numerical Example: Multilevel Analysis
- 14.10 Estimation of the Multilevel Model Parameters: Fixed Effects, Variance Components, and Level 1 Equations
- 14.11 Statistical Tests in Multilevel Models
- 14.12 Some Model Specification Issues
- 14.13 Statistical Power of Multilevel Models
- 14.14 Choosing Between the Fixed Effects Model and the Random Coefficient Model
- 14.15 Sources on Multilevel Modeling
- 14.16 Multilevel Models Applied to Repeated Measures Data
- 14.17 Summary
- Chapter 15: Longitudinal Regression Methods
- 15.1 Introduction
- 15.2 Analyses of Two-Time-Point Data
- 15.3 Repeated Measure Analysis of Variance
- 15.4 Multilevel Regression of Individual Changes Over Time
- 15.4.1 Patterns of Individual Change Over Time
- 15.4.2 Adding Other Fixed Predictors to the Model
- 15.4.3 Individual Differences in Variation Around Individual Slopes
- 15.4.4 Alternative Developmental Models and Error Structures
- 15.4.5 Alternative Link Functions for Predicting Y From Time
- 15.4.6 Unbalanced Data: Variable Timing and Missing Data
- 15.5 Latent Growth Models: Structural Equation Model Representation of Multilevel Data
- 15.6 Time Varying Independent Variables
- 15.7 Survival Analysis
- 15.8 Time Series Analysis
- 15.9 Dynamic System Analysis
- 15.10 Statistical Inference and Power Analysis in Longitudinal Analyses
- 15.11 Summary
- Chapter 16: Multiple Dependent Variables: Set Correlation
- 16.1 Introduction to Ordinary Least Squares Treatment of Multiple Dependent Variables
- 16.2 Measures of Multivariate Association
- 16.3 Partialing in Set Correlation
- 16.4 Tests of Statistical Significance and Statistical Power
- 16.5 Statistical Power Analysis in Set Correlation
- 16.6 Comparison of Set Correlation With Multiple Analysis of Variance
- 16.7 New Analytic Possibilities With Set Correlation
- 16.8 Illustrative Examples
- 16.9 Summary
- Appendices
- Appendix 1: The Mathematical Basis for Multiple Regression/Correlation and Identification of the Inverse Matrix Elements
- Appendix 2: Determination of the Inverse Matrix and Applications Thereof
- Appendix Tables
- Table A t Values for α = .01, .05 (Two Tailed)
- Table B z′ Transformation of r
- Table C Normal Distribution
- Table D F Values for α = .01, .05
- Table E L Values for α = .01, .05
- Table F Power of Significance Test of r at α = .01, .05 (Two Tailed)
- Table G n* to Detect r by t Test at α = .01, .05 (Two Tailed)
- References
- Glossary
- Statistical Symbols and Abbreviations
- Author Index
- Subject Index
Product information
- Title: Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences, 3rd Edition
- Author(s):
- Release date: June 2013
- Publisher(s): Routledge
- ISBN: 9781134801015