Comprehensive Linear Modeling with R provides a broad overview of contemporary linear and non-linear modeling approaches for the analysis of research data. These include basic, conditional and simultaneous inference techniques; analysis of variance (ANOVA); linear regression; survival analysis; generalized linear models (GLMs); parametric and non-parametric smoothers and generalized additive models (GAMs); and longitudinal, mixed-effects, split-plot and other nested model designs. The course showcases the use of R Commander in performing these tasks. R Commander is a popular GUI "front-end" to the broad range of statistical functionality built into R. It is an 'SPSS-like' interface that makes a large variety of statistical and graphical techniques available through both menus and scripts. Please note that the R Commander GUI is built on the tcltk package (R's interface to the Tcl/Tk toolkit), which has historically had problems running on Mac computers.
The course progresses through dozens of statistical techniques, first explaining the concepts and then demonstrating each with concrete examples based on actual studies and research data. Beginning with a quick overview of graphical plotting techniques, the course reviews basic approaches to inference and conditional inference, followed by analysis of variance (ANOVA). It then covers linear regression and a section on validating linear models, after which generalized linear modeling (GLM) is explained and demonstrated with numerous examples. Further sections explain and demonstrate linear and non-linear models for survival analysis; smoothers and generalized additive models (GAMs); longitudinal models with and without generalized estimating equations (GEEs); and mixed-effects, split-plot, and nested designs. Detailed examples show how to validate linear models using various graphical displays, and how to compare alternative models in order to choose the 'best' one. The course concludes with a section on the special considerations and techniques for establishing simultaneous inference in the linear modeling domain.
This comprehensive course aims for complete coverage of linear (and some non-linear) modeling approaches using R and is suitable for beginning, intermediate and advanced R users who wish to refine these skills. Candidates include graduate students as well as quantitative and data-analytic professionals who perform linear (and non-linear) modeling as part of their professional duties.
Statistical inference is the process of deducing properties of an underlying distribution by analysis of data. Inferential statistical analysis infers properties about a population: this includes testing hypotheses and deriving estimates. The population is assumed to be larger than the observed data set; in other words, the observed data is assumed to be sampled from a larger population.
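As a minimal sketch of this idea in R (using simulated data, so the numbers here are illustrative assumptions rather than course data), a one-sample t-test infers a population mean from an observed sample:

```r
# Simulated sample: 50 draws from a population with true mean 5
set.seed(42)
x <- rnorm(50, mean = 5, sd = 2)

# Test the hypothesis that the population mean is 4,
# and derive a 95% confidence interval for the true mean
result <- t.test(x, mu = 4)
result$p.value    # evidence against the hypothesized mean
result$conf.int   # interval estimate for the population mean
```

The sample is treated as having been drawn from a larger population, and the test and interval are statements about that population, not about the 50 observed values themselves.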
Analysis of variance (ANOVA) is a collection of statistical models, and their associated procedures, used to analyze the differences among group means (such as the "variation" among and between groups). In the ANOVA setting, the observed variance in a particular variable is partitioned into components attributable to different sources of variation. In its simplest form, ANOVA provides a statistical test of whether or not the means of several groups are equal, and therefore generalizes the t-test to more than two groups.
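A minimal one-way ANOVA in R might look like the following (the built-in PlantGrowth data set stands in here for the course's own examples):

```r
# One-way ANOVA on R's built-in PlantGrowth data:
# does mean dried plant weight differ across the three treatment groups?
fit <- aov(weight ~ group, data = PlantGrowth)
summary(fit)   # F-test of the hypothesis that all group means are equal
```

The F-statistic compares the between-group variance component to the within-group (residual) component, which is exactly the partitioning of variance described above.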
In statistics, linear regression is an approach for modeling the relationship between a scalar dependent variable y and one or more explanatory variables (or independent variables) denoted X. The case of one explanatory variable is called simple linear regression. For more than one explanatory variable, the process is called multiple linear regression.
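In R, simple linear regression is fit with lm(); the sketch below uses the built-in cars data (an assumption for illustration, not a course data set):

```r
# Simple linear regression on the built-in cars data:
# stopping distance (dist) as a linear function of speed
fit <- lm(dist ~ speed, data = cars)
coef(fit)               # estimated intercept and slope
summary(fit)$r.squared  # proportion of variance explained
```

Adding further terms on the right-hand side of the formula (e.g. dist ~ speed + speed2 for a hypothetical second predictor) turns this into multiple linear regression.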
In statistics, the generalized linear model (GLM) is a flexible generalization of ordinary linear regression that allows for response variables that have error distribution models other than a normal distribution. The GLM generalizes linear regression by allowing the linear model to be related to the response variable via a link function and by allowing the magnitude of the variance of each measurement to be a function of its predicted value.
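A common GLM is logistic regression, where a binomial error distribution is paired with the logit link; a minimal sketch using the built-in mtcars data (chosen here for illustration) is:

```r
# Logistic regression: a binomial GLM with a logit link function.
# Model the probability of a manual transmission (am = 1) from car weight.
fit <- glm(am ~ wt, data = mtcars, family = binomial(link = "logit"))
coef(fit)   # negative slope: heavier cars are less likely to be manual
```

The family argument is what generalizes ordinary lm(): it specifies both the error distribution and the link function relating the linear predictor to the response.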
Survival analysis is a branch of statistics for analyzing the expected duration of time until one or more events happen, such as death in biological organisms and failure in mechanical systems. This topic is called reliability theory or reliability analysis in engineering, duration analysis or duration modeling in economics, and event history analysis in sociology. Survival analysis attempts to answer questions such as: what is the proportion of a population which will survive past a certain time? Of those that survive, at what rate will they die or fail? Can multiple causes of death or failure be taken into account? How do particular circumstances or characteristics increase or decrease the probability of survival?
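In R, the recommended survival package (shipped with R) provides the core tools; the sketch below estimates Kaplan-Meier survival curves on that package's lung data set (an illustrative choice, not necessarily the course's data):

```r
# Kaplan-Meier survival curves with the recommended 'survival' package.
# lung: follow-up times and death/censoring status for lung cancer patients.
library(survival)
fit <- survfit(Surv(time, status) ~ sex, data = lung)
summary(fit)$table   # events and median survival time for each sex
```

Surv(time, status) builds the censored response object, and stratifying by sex answers the "proportion surviving past a certain time" question separately for each group.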
A smoother is a statistical technique for estimating a real-valued function from its noisy observations, when no parametric model for the function is known. The estimated function is smooth (though generally non-linear), and the level of smoothness is set by a single parameter.
In statistics, a generalized additive model (GAM) is a generalized linear model in which the linear predictor depends linearly on unknown smooth functions of some predictor variables, and interest focuses on inference about these smooth functions.
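A GAM can be fit in R with the recommended mgcv package (shipped with R); the sketch below uses the built-in mtcars data as a stand-in for the course's examples:

```r
# Generalized additive model with the recommended 'mgcv' package:
# fuel economy as unknown smooth functions of weight and horsepower.
library(mgcv)
fit <- gam(mpg ~ s(wt) + s(hp), data = mtcars)
summary(fit)   # approximate significance tests for the smooth terms
```

Each s() term replaces a linear coefficient with an estimated smooth function, and the summary output provides the inference about those smooths that the GAM framework focuses on.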
Kyphosis (from Greek κυφός kyphos, a hump) refers to an abnormally excessive convex curvature of the spine as it occurs in the thoracic and sacral regions. (Inward concave curving of the cervical and lumbar regions of the spine is called lordosis.) Kyphosis can be called roundback or Kelso's hunchback. It can result from degenerative diseases such as arthritis; developmental problems, most commonly Scheuermann's disease; osteoporosis with compression fractures of the vertebrae; multiple myeloma; or trauma.
LOESS and LOWESS (locally weighted scatterplot smoothing) are two strongly related non-parametric regression methods that combine multiple regression models in a k-nearest-neighbor-based meta-model. "LOESS" is a later generalization of LOWESS; although it is not a true initialism, it may be understood as standing for "LOcal regrESSion".
LOESS and LOWESS thus build on "classical" methods, such as linear and nonlinear least squares regression.
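Base R provides loess() directly; a minimal sketch on the built-in cars data (an illustrative choice) is:

```r
# LOESS fit to the built-in cars data. 'span' is the single smoothness
# parameter: the fraction of nearest-neighbour points used in each
# local regression (larger span = smoother curve).
fit <- loess(dist ~ speed, data = cars, span = 0.75)
predict(fit, newdata = data.frame(speed = 15))  # smoothed estimate at speed 15
```

At each point, the fit is a weighted least-squares regression over the k nearest neighbours, which is the "meta-model of multiple regressions" described above.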
A mixed model is a statistical model containing both fixed effects and random effects. These models are useful in a wide variety of disciplines in the physical, biological and social sciences. They are particularly useful in settings where repeated measurements are made on the same statistical units (longitudinal study), or where measurements are made on clusters of related statistical units.
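A linear mixed-effects model can be fit with the recommended nlme package (shipped with R); the sketch below uses that package's Orthodont data, repeated dental measurements on the same children, as an illustrative longitudinal example:

```r
# Linear mixed-effects model with the recommended 'nlme' package.
# Orthodont: repeated jaw measurements on the same children over time,
# so observations within a child are not independent.
library(nlme)
fit <- lme(distance ~ age, random = ~ 1 | Subject, data = Orthodont)
fixef(fit)    # fixed effects: population-level intercept and age slope
VarCorr(fit)  # random-intercept (between-child) and residual variance
```

The fixed effects describe the average growth trend; the random intercept per Subject captures how individual children deviate from it, which is exactly the repeated-measurements setting described above.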
In descriptive statistics, a box plot or boxplot is a convenient way of graphically depicting groups of numerical data through their quartiles. Box plots may also have lines extending vertically from the boxes (whiskers) indicating variability outside the upper and lower quartiles, hence the terms box-and-whisker plot and box-and-whisker diagram. Outliers may be plotted as individual points. Box plots are non-parametric: they display variation in samples of a statistical population without making any assumptions of the underlying statistical distribution.
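In R this is the boxplot() function; the sketch below (again using the built-in PlantGrowth data as a stand-in) computes the plot's ingredients without drawing it:

```r
# Box-and-whisker summary of plant weight by treatment group.
# plot = FALSE returns the computed statistics instead of drawing.
b <- boxplot(weight ~ group, data = PlantGrowth, plot = FALSE)
b$stats   # per group: lower whisker, Q1, median, Q3, upper whisker
b$out     # observations beyond the whiskers, plotted as outlier points
```

By default the whiskers extend to the most extreme points within 1.5 times the interquartile range of the box, and anything beyond that appears in the out component.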
In statistics, a generalized estimating equation (GEE) is used to estimate the parameters of a generalized linear model with a possible unknown correlation between outcomes.
Parameter estimates from the GEE are consistent even when the covariance structure is misspecified, under mild regularity conditions. The focus of the GEE is on estimating the average response over the population ("population-averaged" effects) rather than the regression parameters that would enable prediction of the effect of changing one or more covariates on a given individual. GEEs are usually used in conjunction with Huber-White standard error estimates, also known as "robust standard error" or "sandwich variance" estimates. In the case of a linear model with a working independence variance structure, these are known as "heteroscedasticity consistent standard error" estimators. Indeed, the GEE unified several independent formulations of these standard error estimators in a general framework.
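One way to fit a GEE in R is with the geepack package (an add-on package, not part of base R, so install.packages("geepack") is needed first); the sketch below uses that package's dietox data, repeated weight measurements on pigs:

```r
# GEE for clustered/longitudinal outcomes, sketched with 'geepack'
# (an add-on package; run install.packages("geepack") first).
library(geepack)
data(dietox)  # repeated weight measurements on pigs over time
fit <- geeglm(Weight ~ Time, id = Pig, data = dietox,
              family = gaussian, corstr = "exchangeable")
summary(fit)  # coefficients with robust ("sandwich") standard errors
```

The id argument defines the clusters (pigs), corstr states the working correlation structure, and the reported standard errors are the robust sandwich estimates, which remain valid even if "exchangeable" is the wrong working structure.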
Dr. Geoffrey Hubona held full-time tenure-track (and tenured) assistant and associate professor positions at three major state universities in the Eastern United States from 1993 to 2010. In these positions, he taught dozens of statistics, business information systems, and computer science courses to undergraduate, master's and Ph.D. students. He earned a Ph.D. in Business Administration (Information Systems and Computer Science) from the University of South Florida (USF) in Tampa, FL; an MA in Economics, also from USF; an MBA in Finance from George Mason University in Fairfax, VA; and a BA in Psychology from the University of Virginia in Charlottesville, VA. He is the founder of the Georgia R School (2010-2014) and of R-Courseware (2014-Present), online educational organizations that teach research methods and quantitative analysis techniques. These techniques include linear and non-linear modeling, multivariate methods, data mining, programming and simulation, and structural equation modeling and partial least squares (PLS) path modeling.