

SPSS and SAS programs for comparing Pearson
correlations and OLS regression coefficients

This page provides SPSS syntax files and associated output for the methods described in the Behavior Research Methods article by Weaver & Wuensch.


We thank Ray Koopman for noticing a problem with the original version of our t-test for comparing two independent ordinary least squares (OLS) regression coefficients.  Ray also noticed that we had not implemented Steiger's (1980) adjustment when computing the standard errors for the PF and ZPF tests.  The Errata for our article can be downloaded here.  The details of the corrections are summarized below.

Problem with t-test for comparing two OLS regression coefficients

We computed the standard error of the difference between the two coefficients using a method that does not assume equal variances.  Therefore, we ought to have used Satterthwaite degrees of freedom (df), as is done in the unequal-variances version of the t-test for comparing two means.  We have modified our code to use the correct df for that t-test.  Our revised code also computes the pooled-variance version of the same t-test.
Users can indicate which version of the test they want by setting input variable Pool = 1 (for the pooled-variance test) or Pool = 0 (for the unequal-variances test).  Note that the pooled-variance test is the one that corresponds to Potthoff analysis, which can be carried out if one has the raw data.
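As a cross-check on the corrected df, here is a minimal sketch of the unequal-variances test with Satterthwaite df, written in Python for compactness.  The function name and the numeric inputs are illustrative only, not taken from our syntax files; they assume each coefficient's standard error and residual df are already known.

```python
import math

def compare_slopes_unequal(b1, se1, df1, b2, se2, df2):
    """Unequal-variances t-test for H0: beta1 = beta2, two independent samples.
    b1, b2: OLS coefficients; se1, se2: their standard errors;
    df1, df2: residual df for each model.  Returns (t, Satterthwaite df)."""
    v1, v2 = se1 ** 2, se2 ** 2
    t = (b1 - b2) / math.sqrt(v1 + v2)  # SE of the difference = sqrt(v1 + v2)
    # Welch-Satterthwaite approximation to the df of that t statistic
    df = (v1 + v2) ** 2 / (v1 ** 2 / df1 + v2 ** 2 / df2)
    return t, df

# Made-up example values for the two coefficients
t, df = compare_slopes_unequal(1.5, 0.3, 20, 0.9, 0.4, 25)
print(round(t, 3), round(df, 1))
```

The resulting df will generally be fractional and no larger than df1 + df2, which is what distinguishes this version from the pooled-variance test.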

Steiger's adjustment when computing PF and ZPF

Steiger's adjustment consists of replacing both r12 and r34 with Mean(r12,r34) in the equations for their respective standard errors, including the computation of k (see Equations 18 and 19 in the original article).  Accordingly, we have modified our code for PF and ZPF to compute both Steiger's modified versions and the original versions of those tests.  Users can indicate which version they want by setting input variable Steiger = 1 (for Steiger's modified versions) or Steiger = 0 (for the original versions).

Fisher's r-to-z

Our article shows Fisher's r-to-z transformation in Equation 2.  Note that the absolute value function we included is not necessary, because (1+r)/(1-r) cannot be negative.  Finally, as Ray Koopman (personal communication) noted, Fisher's r-to-z "is better known in the wider world as the hyperbolic arctangent (aka arctanh)".  So if one is using software that has an arctanh function, it can be used in place of our Equation 2.
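For example, in any language with an atanh function the two forms are numerically identical (the value of r below is just an illustrative correlation):

```python
import math

r = 0.5  # illustrative correlation
z_eq2 = 0.5 * math.log((1 + r) / (1 - r))  # our Equation 2, abs() omitted
z_atanh = math.atanh(r)                    # built-in hyperbolic arctangent
print(z_eq2, z_atanh)  # both = atanh(0.5) ≈ 0.5493
```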

SPSS Syntax Files to Perform the Analyses

NOTE (30-Apr-2014):  When these syntax files were first developed, they all ran without errors.  But when I recently attempted to run some of them using a newer version of IBM-SPSS (SPSS, 64-bit version under Windows 7 Professional, SP1), some COMPUTE lines involving scratch variables (e.g., #tneg) caused errors.  (For more details, see this thread from the SPSSX-L mailing list.)  Therefore, I have uploaded revised versions of the affected files that eliminate the problematic scratch variables.  (Syntax files downloaded via the journal website remain the original versions that contain the potentially problematic scratch variables.)

To facilitate opening the SPSS syntax files in a web-browser, they are stored as text files (.txt) rather than as SPSS syntax files (.sps).  When viewing a text file in your browser, you can use Save-As to save it to your local computer.  While saving it, or after the fact using a file manager, you can change the extension from .txt to .sps.
  1. Test value of rho  -- for each test, enter r, rho, n, alpha and note, then run the syntax.
  2. Test intercept and/or slope -- for each test, enter the value of the estimator (intercept or slope), the corresponding parameter value under the null hypothesis, standard error, df, alpha, and note.  Run.
  3. Compare two independent correlation coefficients -- enter r1, r2, n1, n2, alpha, and note.  Run.
  4. Compare two independent regression coefficients -- enter number of predictors, both slopes or intercepts, both sample sizes, alpha, and note.  Run.
  5. Compare two or more independent regression coefficients --  for each group, enter group number, coefficient, standard error, sample size, and note.  Run.  Note that this procedure differs from the others in that you need a different data set (created via the DATA LIST command) for each set of coefficients you wish to test.  (For the other procedures, all of the information for a given test is contained within a single row of data, so multiple tests can be done within a single data set, one test per row.)
    • NOTE that syntax file 5 includes an example of how to compare two or more independent correlations.  Scroll down to the section that starts with *** Test equivalence of correlations ***. 
  6. Compare two non-independent correlations with one variable in common -- enter r12, r13, r23, n, alpha, and note.  Run.
  7. Compare two non-independent correlations with no variables in common -- enter n, r12, r13, r14, r23, r24, r34, alpha, and note.  Run.
  8. Test Hypotheses of Coincidence, Equal Slopes, and Equal Intercepts Using Raw Data, k independent groups --  Tinker with code.  Submit.  (NOTE:  You must edit the GET FILE command to point correctly to the lung.sav data file on your computer.)
  9. Run them all -- edit the GET FILE command to point correctly to the location of the lung.sav data file, then run.
  10. Syntax with associated output from all of them, annotated a bit.
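To show the kind of computation behind syntax file 3, here is a sketch of the standard Fisher r-to-z test for two independent correlations, written in Python rather than SPSS syntax; the function name is ours and the input values are made up.

```python
import math

def compare_independent_rs(r1, n1, r2, n2):
    """z test for H0: rho1 = rho2, two independent samples,
    via Fisher's r-to-z (arctanh).  Returns (z, two-tailed p)."""
    z1, z2 = math.atanh(r1), math.atanh(r2)
    # SE of the difference between the two transformed correlations
    se_diff = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se_diff
    p = math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))
    return z, p

# Made-up inputs: r1, n1, r2, n2
z, p = compare_independent_rs(0.5, 103, 0.3, 103)
print(round(z, 3), round(p, 4))
```

The syntax files take the same inputs (r1, r2, n1, n2, alpha) one test per row, so several such comparisons can be run from a single data set.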

SAS Programs to Perform the Analyses

SAS users can find the corresponding SAS files on Karl Wuensch's website.

The Raw Data

The lung data used by some of these syntax files and SAS programs can be downloaded from the UCLA Statistical Computing website.  (If that page doesn't work, try this one.)

An Online Calculator

A nice online calculator for comparing correlations can be found here.  It uses the cocor package for the R programming language.

Page last modified on 23-Mar-2015