
Correlation & regression
The following methods treat data with heteroscedastic (different for each point) measurement errors, which are commonly present in astronomical data:
Linear regression with measurement errors and scatter
Weighted ordinary least squares line with heteroscedastic measurement errors and homoscedastic intrinsic scatter in the dependent variable. Also includes code in SLOPES. Developed for astronomy by M. Akritas (Penn State) & M. Bershady (Wisconsin).
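The package itself is Fortran; as a minimal sketch of the weighted-OLS core — per-point measurement errors only, whereas the Akritas & Bershady estimator additionally models intrinsic scatter — one might write in Python:

```python
import numpy as np

def weighted_line_fit(x, y, sigma_y):
    """Weighted OLS line y = a + b*x with per-point errors sigma_y."""
    w = 1.0 / sigma_y**2
    S, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
    Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
    delta = S * Sxx - Sx**2
    a = (Sxx * Sy - Sx * Sxy) / delta   # intercept
    b = (S * Sxy - Sx * Sy) / delta     # slope
    return a, b

# Simulated heteroscedastic data on a known line (illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
sigma = rng.uniform(0.1, 0.5, x.size)
y = 1.0 + 2.0 * x + rng.normal(0.0, sigma)
a, b = weighted_line_fit(x, y, sigma)
```

Each point is weighted by the inverse square of its own error, so precise points dominate the fit.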
Partial correlation for censored data
A test for partial correlation between three variables, any or all of which are subject to censoring, based on a generalized Kendall's tau. Developed for astronomy by M. Akritas (Penn State) and J. Siebert (MPI).
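For intuition, a sketch of the uncensored partial Kendall's tau — the quantity that Akritas & Siebert generalize to censored data (the censored version itself is not reproduced here):

```python
import numpy as np
from scipy.stats import kendalltau

def partial_kendall_tau(x, y, z):
    """Partial Kendall's tau between x and y, controlling for z
    (uncensored case only)."""
    t_xy, _ = kendalltau(x, y)
    t_xz, _ = kendalltau(x, z)
    t_yz, _ = kendalltau(y, z)
    return (t_xy - t_xz * t_yz) / np.sqrt((1 - t_xz**2) * (1 - t_yz**2))

# x and y are associated only through the confounder z (illustrative)
rng = np.random.default_rng(1)
z = rng.normal(size=200)
x = z + 0.5 * rng.normal(size=200)
y = z + 0.5 * rng.normal(size=200)
tau_partial = partial_kendall_tau(x, y, z)
```

Controlling for z weakens the apparent x–y association relative to the raw Kendall's tau.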
Measurement error linear regression
Three short Fortran programs implementing errors-in-variables bivariate linear regression (York, Fasano & Vio, and Ripley methods). Developed for astronomy by F. Murtagh of University of London. (Look under "Various other programs".)
ODRPACK
Orthogonal distance nonlinear regression for data weighted by known measurement errors. By the National Institute of Standards & Technology.
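In Python, ODRPACK is accessible through SciPy's `scipy.odr` module. The unweighted linear special case — fitting a line by minimizing perpendicular rather than vertical distances — can be sketched directly with numpy:

```python
import numpy as np

def orthogonal_line_fit(x, y):
    """Fit y = a + b*x minimizing perpendicular (orthogonal) distances.
    The line direction is the principal eigenvector of the covariance
    matrix of the centered data."""
    xm, ym = x.mean(), y.mean()
    cov = np.cov(x - xm, y - ym)
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
    vx, vy = eigvecs[:, -1]                  # largest-eigenvalue direction
    b = vy / vx
    a = ym - b * xm
    return a, b

# Exactly collinear data (illustrative)
x = np.linspace(0.0, 10.0, 50)
y = 3.0 + 0.5 * x
a, b = orthogonal_line_fit(x, y)
```

Unlike ordinary least squares, this treats x and y symmetrically, which is the point of orthogonal distance regression.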
Errors-in-Variables Model
Least squares linear and nonlinear parameter estimation with errors in both the predictor variables and the dependent variable. Applied Statistics Algorithm #286, distributed by StatLib.
Linear regression with measurement errors
Code calculating simultaneous confidence bands for linear regression with heteroscedastic errors using bootstrap resampling, based on Faraway & Sun (JASA 1995). Code in LISP-STAT and S-Plus.
SLOPES
Computes ordinary and symmetrical least-squares regression lines for bivariate data (orthogonal regression, reduced major axis, OLS bisector, and mean OLS). Developed for astronomy by G. J. Babu & E. Feigelson of Penn State.
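SLOPES is distributed as Fortran; as a sketch of some of the slope estimators it computes (using the standard closed-form expressions for the OLS, OLS-bisector, and reduced-major-axis slopes, cf. Isobe et al. 1990), in Python:

```python
import numpy as np

def symmetric_slopes(x, y):
    """Slopes of several bivariate regression lines."""
    sxx = np.sum((x - x.mean())**2)
    syy = np.sum((y - y.mean())**2)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    b1 = sxy / sxx               # OLS(Y|X)
    b2 = syy / sxy               # inverse of OLS(X|Y)
    # OLS bisector: bisects the angle between the two OLS lines
    bis = (b1 * b2 - 1 + np.sqrt((1 + b1**2) * (1 + b2**2))) / (b1 + b2)
    rma = np.sign(sxy) * np.sqrt(syy / sxx)   # reduced major axis
    return {"OLS(Y|X)": b1, "OLS(X|Y)": b2, "bisector": bis, "RMA": rma}

# On exactly collinear data all four slopes coincide (illustrative)
x = np.arange(1.0, 11.0)
slopes = symmetric_slopes(x, 2.0 * x)
```

The symmetric lines (bisector, RMA) are appropriate when neither variable is naturally the "dependent" one.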
PROGRESS
Least median of squares (LMS) regression and least trimmed squares (LTS), which are highly robust to outliers in the data. By P. Rousseeuw of University of Antwerp.
Fast Least Trimmed Squares (LTS)
Robust multivariate regression technique based on the subset of points whose least-squares fit gives the smallest sum of squared residuals. An efficient method for large datasets. By P. Rousseeuw of University of Antwerp.
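A much simplified sketch of the LTS idea — random starts followed by "concentration" steps that refit on the points with the smallest squared residuals; the actual FAST-LTS algorithm is considerably more refined:

```python
import numpy as np

def lts_line(x, y, h=None, n_trials=200, seed=0):
    """Least trimmed squares line fit: minimize the sum of the h
    smallest squared residuals (simplified illustrative version)."""
    rng = np.random.default_rng(seed)
    n = x.size
    h = h or (n // 2 + 1)
    A = np.column_stack([np.ones(n), x])
    best_coef, best_obj = None, np.inf
    for _ in range(n_trials):
        idx = rng.choice(n, size=2, replace=False)     # elemental start
        coef = np.linalg.lstsq(A[idx], y[idx], rcond=None)[0]
        for _ in range(10):                            # concentration steps
            r2 = (y - A @ coef)**2
            keep = np.argsort(r2)[:h]
            coef = np.linalg.lstsq(A[keep], y[keep], rcond=None)[0]
        obj = np.sort((y - A @ coef)**2)[:h].sum()
        if obj < best_obj:
            best_obj, best_coef = obj, coef
    return best_coef  # [intercept, slope]

# A fifth of the points are gross outliers; LTS ignores them (illustrative)
x = np.linspace(0.0, 10.0, 40)
y = 1.0 + 3.0 * x
y[:8] += 50.0
coef = lts_line(x, y)
```

Because only the h best-fitting points enter the objective, up to almost half the data can be arbitrarily corrupted without ruining the fit.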
NLR
Programs for nonlinear parameter estimation by least squares, maximum likelihood, and some robust methods. From NIST's GAMS.
Nonlinear regression
Large Fortran program for maximum-likelihood and quasi-ML estimation of parameters in nonlinear regression models. TOMS Algorithm #717.
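Neither package is needed to see the basic task; a sketch of nonlinear least-squares parameter estimation in Python using SciPy's `curve_fit` (the model below is a hypothetical exponential decay, chosen only for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(t, amp, rate, offset):
    """Hypothetical model: exponential decay with a constant offset."""
    return amp * np.exp(-rate * t) + offset

# Simulated data from known parameters (2.5, 1.3, 0.4) plus noise
rng = np.random.default_rng(2)
t = np.linspace(0.0, 5.0, 80)
y = model(t, 2.5, 1.3, 0.4) + rng.normal(0.0, 0.02, t.size)

popt, pcov = curve_fit(model, t, y, p0=[1.0, 1.0, 0.0])
perr = np.sqrt(np.diag(pcov))   # approximate one-sigma parameter errors
```

The covariance matrix returned alongside the best-fit parameters gives the usual asymptotic error estimates.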
Least squares codes
An extensive collection of Fortran 90 codes for unconstrained linear and nonlinear least-squares, ridge regression, fitting ellipses to (x,y) data, logistic regression, and more. From Alan J. Miller (CSIRO).
EasyReg
Econometrics package for Windows including variable transformations, kernel density estimation, time series analysis (cross-correlation, stationarity tests, ARIMA & GARCH modeling), linear regression models (Poisson regression, Tobit, two-stage least squares, user-supplied nonlinear), and more. By H. Bierens (Penn State).
Nonlinear Statistical Models
C++ implementation of least squares estimates for univariate and multivariate nonlinear regression. Associated with the text by A. R. Gallant (1987).
Regress+
A Macintosh-based program for linear and nonlinear regression, with bootstrap estimation of parameter errors and other options. From causascientia.org.
Generalized additive models
Generalized additive model fitting for a variety of models (Gaussian, binomial, Poisson, gamma, Cox) using cubic smoothing splines. Distributed by StatLib.
Robust linear regression
Robust regression by least absolute deviations. Applied Statistics Algorithm #132, distributed by StatLib.
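Algorithm #132 is Fortran; the same L1 criterion can be sketched in Python via iteratively reweighted least squares — an assumed, simplified route to the LAD solution, not the algorithm's own method:

```python
import numpy as np

def lad_line(x, y, n_iter=50, eps=1e-8):
    """Least absolute deviations line fit by iteratively reweighted
    least squares: weighting each point by 1/|residual| turns the
    weighted L2 objective into an L1 one at convergence."""
    A = np.column_stack([np.ones_like(x), x])
    coef = np.linalg.lstsq(A, y, rcond=None)[0]   # OLS starting point
    for _ in range(n_iter):
        r = np.abs(y - A @ coef)
        w = 1.0 / np.maximum(r, eps)              # guard against r = 0
        sw = np.sqrt(w)
        coef = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)[0]
    return coef  # [intercept, slope]

# Two gross outliers barely perturb the LAD fit (illustrative)
x = np.linspace(0.0, 10.0, 30)
y = 2.0 + 0.5 * x
y[[3, 15]] += 20.0
coef = lad_line(x, y)
```

Minimizing absolute rather than squared residuals downweights outliers, which is why LAD is far more robust than OLS.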
Confidence intervals for nonlinear regression
Generates a grid of variance ratios to plot confidence regions for two parameters using Halperin's method. Applied Statistics Algorithm #290, distributed by StatLib.
