Sample Size Estimates for Well-Powered Cross-Sectional Cortical Thickness Studies

This page provides data and software to help researchers plan how many subjects per group to include in an MRI-based cortical thickness study to ensure that a given thickness difference can be detected. Cortical thickness mapping and co-registration were carried out using Freesurfer. The power analyses are implemented in the R software environment.

If you use this software, please cite this manuscript:

Pardoe, H. R., Abbott, D. F., Jackson, G. D. and The Alzheimer's Disease Neuroimaging Initiative (2013), Sample size estimates for well-powered cross-sectional cortical thickness studies. Human Brain Mapping, 34(11): 3000-3009.

Contact: heath.pardoe [at] nyumc.org

Overlay files: Left and right hemisphere Freesurfer surface overlay files containing a vertex-wise map of the number of subjects required to reliably detect a cortical thickness difference of 0.25 mm when 10 mm FWHM surface smoothing is used (alpha = 0.05, two-sided analysis). The surface files can be overlaid on the fsaverage template. These maps may be useful if you are interested in thickness differences in a particular brain region and want to estimate whether you will need to scan more or fewer subjects relative to the sample sizes reported in the article.
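If you have Freesurfer installed, the overlays can be viewed on the fsaverage surfaces with freeview. A dry-run sketch follows: it prints the command rather than launching the viewer, and the overlay file name lh.samplesize.overlay.mgh is a placeholder for whichever left hemisphere file you downloaded.

```shell
# Print the freeview command for displaying a sample-size overlay on the
# fsaverage inflated surface. "lh.samplesize.overlay.mgh" is a placeholder
# name; substitute the downloaded overlay file. Remove the leading "echo"
# to actually launch freeview.
echo freeview -f "$SUBJECTS_DIR/fsaverage/surf/lh.inflated:overlay=lh.samplesize.overlay.mgh"
```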

cortex: A package of functions to help you calculate how many subjects should be included in your study, implemented in the free software environment R. Download the package, then install it from within R:
R> install.packages("/path/to/cortex_0.1.tar.gz", repos = NULL, type = "source")

A step-by-step guide to estimating the number of subjects per group required for a well-powered Freesurfer-based cross-sectional thickness study

Instructions are provided for two circumstances:
(A) You have a preliminary control dataset from the same scanner & acquired using the same acquisition protocol as your study; or 
(B) No control data

A. Preliminary Control Data Available

This approach is recommended because it takes into account any idiosyncrasies of your particular cohort, scanner, and acquisition protocol.

1.     Run the standard Freesurfer processing stream on your controls
        (i) Process each subject using Freesurfer: recon-all -i subject.image -subjid subject.name -all
        (ii) Coregister the thickness maps to the fsaverage brain: recon-all -subjid subject.name -qcache
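With several controls, both steps can be scripted. The following is a minimal dry-run sketch that prints the two recon-all commands for a hypothetical list of subjects; the subject names and the .nii extension are assumptions for illustration.

```shell
# Dry run: print both recon-all commands for each control subject.
# Subject names and the .nii extension are illustrative assumptions;
# remove the "echo" to execute the commands for real.
for subj in control01 control02; do
    echo recon-all -i "${subj}.nii" -subjid "$subj" -all
    echo recon-all -subjid "$subj" -qcache
done
```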

2.     Make a text file with demographic information for vertex-wise correction for age and sex, laid out as a space-delimited file with "id age sex" on the first line, followed by one line per subject giving the subject.name, age at time of scan, and sex as "M" (male) or "F" (female). Note: the subject.name must match the name used for the Freesurfer analysis, and the first line must be exactly "id age sex"; otherwise the scripts won't work. For example, make a file called my.demographics.file.dat containing the following:
id age sex
Alessio 19 M
Julia 23 F
Alex 22 M
Fiona 25 F
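Since the scripts fail if the header or layout is wrong, it can be worth sanity-checking the file before moving on to R. A small shell sketch (the here-document simply recreates the example file above; point awk at your own file instead):

```shell
# Recreate the example demographics file, then verify that the header line
# is exactly "id age sex" and that every row has three space-delimited fields.
cat > my.demographics.file.dat <<'EOF'
id age sex
Alessio 19 M
Julia 23 F
Alex 22 M
Fiona 25 F
EOF
awk 'NR == 1 && $0 != "id age sex" { print "bad header"; exit 1 }
     NF != 3                       { print "bad row: " NR; exit 1 }
     END                           { print "demographics file looks OK" }' my.demographics.file.dat
```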

3. Run the following commands in R:
R> library("cortex")
R> out <- estimate.sample.size.with.prelim.data("/path/to/my.demographics.file.dat", effect.size = 0.25, smoothing = 10, alpha = 0.005, sidedness = "two.sided", absolute = TRUE)
On my desktop machine this process takes about 5 minutes. Type out$n.per.group to get the sample size estimate for your particular dataset. The arguments are: effect.size, the effect size you want to detect; smoothing, the FWHM of the smoothing filter you will use for your study; alpha, the false positive rate; sidedness, "two.sided" or "one.sided"; and absolute, TRUE if you specified the effect size as an absolute thickness difference in mm or FALSE if you provided a relative thickness difference as a percentage. The function returns a list containing the following:

out$sample.size.vector.lh    vertex-wise estimates of the number of subjects per group over the left hemisphere
out$sample.size.vector.rh    vertex-wise estimates of the number of subjects per group over the right hemisphere
out$n.per.group              the 95th percentile of these two vectors combined
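out$n.per.group is therefore a single number large enough to power roughly 95% of cortical vertices. As an illustration of the percentile step only (R's quantile() supports several interpolation methods, so this nearest-rank version is an assumption rather than the package's exact code), with ten made-up per-vertex sample sizes:

```shell
# Nearest-rank 95th percentile of ten made-up per-vertex sample sizes.
# The values are illustrative only; cortex combines the real left and
# right hemisphere vectors before taking the percentile.
printf '%s\n' 10 12 14 16 18 20 40 60 80 100 |
  sort -n |
  awk '{ v[NR] = $0 } END { r = int(0.95 * NR); if (r < 0.95 * NR) r++; print v[r] }'
```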

B. No Control Data Available

This approach provides relatively conservative sample size estimates based on structural scans from our MRI scanner (Siemens TIM-Trio, 0.9 mm isotropic T1-weighted whole brain MPRAGE MRI), and validated on four control datasets from ADNI. If your scan quality is comparable, this technique should provide reasonable estimates of the number of subjects per group required to ensure your study is well powered.

1. Run the following commands in R:

R> library("cortex")
R> estimate.sample.size.no.prelim.data(effect.size, smoothing, alpha, sidedness)

where effect.size is the thickness difference in mm, smoothing is the size of the smoothing filter in mm, alpha is the false positive rate, and sidedness is "two.sided" or "one.sided".

For example:
R> estimate.sample.size.no.prelim.data(effect.size = 0.25, smoothing = 10, alpha = 0.005, sidedness = "two.sided")
[1] 102

tells you that 102 subjects per group are needed to detect a thickness difference of 0.25 mm with a 10 mm smoothing filter, a false positive rate of 0.005 and a two-sided test, in order to cover 95% of the cortex.
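As a rough sanity check on numbers like this, the textbook two-sample comparison with normal quantiles gives n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2 subjects per group. Cortex works vertex-wise from measured thickness variability, so the sketch below is only an approximation; 80% power and sigma = 0.5 mm are illustrative assumptions, not values from the article.

```shell
# Back-of-envelope two-sample sample size per group using normal quantiles.
# sigma (per-vertex thickness SD) = 0.5 mm and 80% power are assumptions
# chosen for illustration; cortex estimates variability from real data.
awk 'BEGIN {
    z_a = 2.807            # 1 - alpha/2 quantile for alpha = 0.005, two-sided
    z_b = 0.842            # 1 - beta quantile for 80% power
    delta = 0.25           # effect size: thickness difference in mm
    sigma = 0.5            # assumed per-vertex thickness SD in mm
    n = 2 * (z_a + z_b) ^ 2 * sigma ^ 2 / delta ^ 2
    printf "about %d subjects per group\n", n + 0.5
}'
```

This lands in the same ballpark as the validated estimate above, but use the package's data-driven numbers for actual study planning.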