**Individual Program Descriptions**

**NOTE:** Clicking on the links below will take you to an online video website. If an error is encountered ("server error" or similar), place the cursor next to the URL in the web browser and hit "enter". The video will reload and should play. If this doesn't work, you should be able to access the videos from the main website here: Program Descriptions

**1. What Is Statistics?**
Using historical anecdotes and contemporary applications, this
introduction to the series explores the vital links between statistics
and our everyday world. The program also covers the evolution of the
discipline.

**2. Picturing Distributions**

With this program, students will see how key characteristics in the
distribution of a histogram — shape, center, and spread — help
professionals make decisions in such diverse fields as meteorology,
television programming, health care, and air traffic control. Through a
discussion of the advantages of back-to-back stem plots, this program
also emphasizes the importance of seeking explanations for gaps and
outliers in small data sets.

**3. Describing Distributions**

This program examines the difference between mean and median, explains
the use of quartiles to describe a distribution, and looks to the use of
boxplots and the five-number summary for comparing and describing data.
An illustrative example shows how a city government used statistical
methods to correct inequity between men’s and women’s salaries.
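
As a brief illustration of these measures, here is a sketch in Python; the salary figures and the `five_number_summary` helper are hypothetical, not from the program.

```python
import statistics

def five_number_summary(data):
    """Minimum, first quartile, median, third quartile, maximum."""
    q1, q2, q3 = statistics.quantiles(data, n=4)  # quartile cut points
    return (min(data), q1, q2, q3, max(data))

# Hypothetical annual salaries in $1000s
salaries = [31, 33, 35, 38, 40, 42, 45, 49, 52, 60]
print("mean:   ", statistics.mean(salaries))    # pulled toward high values
print("median: ", statistics.median(salaries))  # resistant to outliers
print("summary:", five_number_summary(salaries))
```

The mean (42.5) sits above the median (41.0) because the single high salary pulls it up; the five-number summary feeds directly into a boxplot.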

**4. Normal Distributions**

Students will advance from histograms through smooth curves to normal
curves, and finally to a single normal curve for standardized
measurement, as this program shows ways to describe the shape of a
distribution using progressively simpler methods. In a lesson on
creating a density curve, students also learn why, as the standard
deviation of batting averages has steadily decreased, today’s baseball
players are less likely to achieve a .400 batting average.

**5. Normal Calculations**

With this program, students will discover how to standardize values
using the mean and standard deviation; how to use a table of areas under
the standard normal curve to compute relative frequencies; how to find
any percentile; and how a computer creates a normal quantile plot to
determine whether a distribution is normal. Vehicle emissions standards
and medical studies
of cholesterol provide real-life examples.
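
The standardization step and the table of areas can be sketched in Python; the cholesterol figures below are invented for illustration, and `normal_cdf` stands in for the printed table.

```python
import math

def z_score(x, mu, sigma):
    """How many standard deviations x lies above the mean."""
    return (x - mu) / sigma

def normal_cdf(z):
    """Area under the standard normal curve to the left of z."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical cholesterol levels: mean 170 mg/dl, standard deviation 30
z = z_score(230, 170, 30)
print(f"z = {z:.1f}")
print(f"proportion below 230 mg/dl: {normal_cdf(z):.4f}")
```

A level of 230 standardizes to z = 2, and about 97.7% of the distribution lies below it, consistent with the 68-95-99.7 rule.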

**6. Time Series**

Statistics can reveal patterns over time. Using the concept of seasonal
variation, this program shows ways to smooth data and to recognize
whether a particular pattern is meaningful. Stock market trends and
sleep cycles are used to explore the topics of deriving a time series
and using the 68-95-99.7 rule to determine control limits.
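
Seasonal smoothing is often done with a simple moving average; here is a minimal sketch in Python, using made-up monthly figures.

```python
def moving_average(series, window):
    """Smooth a time series by averaging each run of `window` values."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

# Hypothetical monthly sales with seasonal ups and downs
sales = [10, 14, 9, 11, 15, 10, 12, 16, 11]
smoothed = moving_average(sales, 3)
print(smoothed)
```

Each smoothed value averages away one cycle's swing, so the underlying upward drift in these figures becomes visible.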

**7. Models for Growth**

Topics of this program include linear growth, least squares,
exponential growth, and straightening an exponential growth curve with
logarithms. A study
of growth problems in children serves to illustrate the use of the
logarithm function to transform an exponential pattern into a line. The
program also discusses growth in world oil production over time.

**8. Describing Relationships**

Segments describe how to use a scatterplot to display relationships
between variables. Patterns in variables (positive, negative, and linear
association) and the importance of outliers are discussed. The program
also calculates the least squares regression line of metabolic rate *y*
on lean body mass *x* for a group of subjects and examines the fit
of the regression line by plotting residuals.
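
The least-squares computation itself is short; here is a sketch in Python with invented mass and rate figures (the program uses data from actual subjects).

```python
from statistics import mean

def least_squares(x, y):
    """Intercept a and slope b of the least-squares line y = a + b*x."""
    xbar, ybar = mean(x), mean(y)
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    return ybar - b * xbar, b

# Hypothetical lean body mass (kg) and metabolic rate (calories/day)
mass = [40, 45, 50, 55, 60]
rate = [1100, 1200, 1300, 1350, 1500]
a, b = least_squares(mass, rate)
residuals = [yi - (a + b * xi) for xi, yi in zip(mass, rate)]
print(f"rate = {a:.0f} + {b:.0f} * mass")
```

Plotting the residuals against mass checks the fit: they should scatter around zero with no pattern, and by construction they sum to zero.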

**9. Correlation**

With this program, students will learn to derive and interpret the
correlation coefficient using the relationship between a baseball
player’s salary and his home run statistics. They will see that the
correlation coefficient measures the strength and direction of a linear
relationship between two variables, and that its square measures the
proportion of the variation in one variable that is explained by the
other. A study comparing identical twins raised together and apart
illustrates the concept of correlation.
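
A sketch of the calculation in Python, with invented home-run and salary figures:

```python
import math
from statistics import mean

def correlation(x, y):
    """Pearson correlation coefficient r, between -1 and 1."""
    xbar, ybar = mean(x), mean(y)
    sxy = sum((a - xbar) * (b - ybar) for a, b in zip(x, y))
    sxx = sum((a - xbar) ** 2 for a in x)
    syy = sum((b - ybar) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical season home runs and salaries ($ millions) for five players
home_runs = [10, 20, 25, 30, 40]
salaries = [1.0, 2.5, 2.0, 4.0, 5.0]
r = correlation(home_runs, salaries)
print(f"r = {r:.3f}, r^2 = {r * r:.3f}")
```

An r near +1 signals a strong positive association; r squared is the fraction of the salary variation accounted for by the linear relationship with home runs.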

**10. Multidimensional Data Analysis**

This program reviews the presentation of data analysis through an
examination of computer graphics for statistical analysis at Bell
Communications Research. Students will see how the computer can graph
multivariate data and the various ways of presenting it. The program
concludes with an example of a study that analyzes data on many
variables to get a picture of environmental stresses in the Chesapeake
Bay.

**11. The Question of Causation**

Causation is only one of many possible explanations for an observed
association. This program defines the concepts of common response and
confounding, explains the use of two-way tables of percents to calculate
marginal distribution, uses a segmented bar to show how to visually
compare sets of conditional distributions, and presents a case of
Simpson’s Paradox. The relationship between smoking and lung cancer
provides a clear example.
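
Simpson’s Paradox is easy to reproduce with a small two-way table; the counts below are invented for illustration.

```python
# Hypothetical (successes, failures) for two treatments at two hospitals
hospital1 = {"A": (90, 10),   "B": (800, 200)}  # A wins: 90% vs 80%
hospital2 = {"A": (200, 800), "B": (10, 90)}    # A wins: 20% vs 10%

def rate(successes, failures):
    return successes / (successes + failures)

for tx in ("A", "B"):
    s = hospital1[tx][0] + hospital2[tx][0]
    f = hospital1[tx][1] + hospital2[tx][1]
    print(f"treatment {tx} combined success rate: {rate(s, f):.2f}")
```

Treatment A beats B inside each hospital, yet B looks better in the combined table (roughly 74% versus 26%) because B was mostly given at the hospital with the easier cases: the hospital is a lurking variable.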

**12. Experimental Design**

Statistics can be used to evaluate anecdotal evidence. This program
distinguishes between observational studies and experiments and reviews
basic principles of design including comparison, randomization, and
replication. Case material from the Physician’s Health Study on heart
disease demonstrates the advantages of a double-blind experiment.

**13. Blocking and Sampling**

Students learn to draw sound conclusions about a population from a tiny
sample. This program focuses on random sampling and the census as two
ways to obtain reliable information about a population. It covers
single- and multi-factor experiments and the kinds of questions each can
answer, and explores randomized block design through agriculturalists’
efforts to find a better strawberry.

**14. Samples and Surveys**

This program shows how to improve the accuracy of a survey by using
stratified random sampling and how to avoid sampling errors such as
bias. While surveys are becoming increasingly important tools in shaping
public policy, a 1936 Gallup poll provides a striking illustration of
the perils of undercoverage.
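
Proportional stratified sampling is straightforward to sketch in Python; the population of 600 urban and 400 rural households is invented, and the allocation rounding here is deliberately naive.

```python
import random

def stratified_sample(strata, total_n):
    """Draw from each stratum in proportion to its population share."""
    population = sum(len(members) for members in strata.values())
    sample = []
    for members in strata.values():
        # naive proportional allocation; real designs adjust the rounding
        k = round(total_n * len(members) / population)
        sample.extend(random.sample(members, k))
    return sample

# Hypothetical population: 600 urban and 400 rural households
strata = {"urban": list(range(600)), "rural": list(range(600, 1000))}
random.seed(1)  # reproducible draw
chosen = stratified_sample(strata, 50)
print(len(chosen), "households;", sum(h < 600 for h in chosen), "urban")
```

Because each stratum is represented in its true proportion, chance cannot hand the whole sample to one group, which is how stratification reduces sampling variability.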

**15. What Is Probability?**

Students will learn the distinction between deterministic phenomena and
random phenomena. This program introduces the concepts of sample space,
events, and outcomes, and demonstrates how to use them to create a
probability model. A discussion of statistician Persi Diaconis’s work
with probability theory covers many of the central ideas about
randomness and probability.
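
A probability model in this sense can be written down directly; here is a sketch in Python for two fair dice.

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 ordered pairs of faces, each equally likely
sample_space = list(product(range(1, 7), repeat=2))
p_outcome = Fraction(1, len(sample_space))

# An event is a collection of outcomes; its probability is the sum
event = [pair for pair in sample_space if sum(pair) == 7]
print("P(sum is 7) =", p_outcome * len(event))
```

The model assigns every outcome a probability, and the probabilities across the whole sample space add to exactly 1.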

**16. Random Variables**

This program demonstrates how to determine the probability of any number
of independent events, incorporating many of the same concepts used in
previous programs. An interview with a statistician who helped to
investigate the space shuttle accident shows how probability can be used
to estimate the reliability of equipment.
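
The multiplication rule for independent events takes one line; the component reliabilities below are invented, not the actual shuttle figures.

```python
import math

# Hypothetical reliabilities of four components that fail independently
reliabilities = [0.99, 0.98, 0.995, 0.97]

# Independence lets us multiply: P(all work) = product of each P(work)
p_all_work = math.prod(reliabilities)
print(f"P(system works) = {p_all_work:.4f}")
print(f"P(at least one failure) = {1 - p_all_work:.4f}")
```

Even with every component above 97% reliable, the chance that the whole chain works drops below 94%, which is why reliability estimates multiply out so unforgivingly.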

**17. Binomial Distributions**

This program discusses binomial distribution and the criteria for it,
and describes a simple way to calculate its mean and standard deviation.
An additional feature describes the quincunx, a randomizing device at
the Boston Museum of Science, and explains how it represents the
binomial distribution.
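
The mean and standard deviation the program derives take one line each; here is a sketch in Python using a quincunx-style example of 100 balls, each bouncing right with probability 0.5.

```python
import math

def binomial_mean_sd(n, p):
    """Mean n*p and standard deviation sqrt(n*p*(1-p)) of a binomial count."""
    return n * p, math.sqrt(n * p * (1 - p))

# 100 independent left/right bounces, each right with probability 0.5
mu, sigma = binomial_mean_sd(100, 0.5)
print(f"mean = {mu}, standard deviation = {sigma}")
```

With mean 50 and standard deviation 5, nearly all balls land within three standard deviations of the center slot, which is why the quincunx piles up its familiar bell shape.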

**18. The Sample Mean and Control Charts**

The successes of casino owners and the manufacturing industry are used
to demonstrate the use of the central limit theorem. One example shows
how control charts allow us to effectively monitor random variation in
business and industry. Students will learn how to create x-bar charts
and the definitions of control limits and out-of-control signals.
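
The x-bar chart limits follow directly from the central limit theorem; a sketch in Python with an invented process target and spread:

```python
import math

def xbar_limits(mu, sigma, n):
    """Center line and 3-sigma control limits for means of samples of size n."""
    se = sigma / math.sqrt(n)  # standard deviation of the sample mean
    return mu - 3 * se, mu, mu + 3 * se

# Hypothetical process: target 100, process sd 8, samples of 16 items
lcl, center, ucl = xbar_limits(100, 8, 16)
print(f"control limits: {lcl} to {ucl}, center line {center}")
```

A sample mean outside these limits is a signal worth investigating, since random variation alone puts it there only rarely (the 99.7 part of the 68-95-99.7 rule).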

**19. Confidence Intervals**

This program lays out the parts of the confidence interval and gives an
example of how it is used to measure the accuracy of long-term mean
blood pressure. An example from politics and population surveys shows
how margin of error and confidence levels are interpreted. The program
also explains the use of a formula to convert *z*\* values into
values on the sampling distribution curve. Finally, the concepts are
applied to an issue of animal ethics.
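
The interval itself is one formula; here is a sketch in Python with invented blood-pressure numbers (z* = 1.96 is the standard 95% critical value).

```python
import math

def mean_ci(xbar, sigma, n, z_star=1.96):
    """Confidence interval xbar +/- z* * sigma / sqrt(n)."""
    margin = z_star * sigma / math.sqrt(n)
    return xbar - margin, xbar + margin

# Hypothetical readings: sample mean 120, known sd 10, n = 25 measurements
low, high = mean_ci(120, 10, 25)
print(f"95% confidence interval: ({low:.2f}, {high:.2f})")
```

The margin of error shrinks with the square root of n: quadrupling the sample size halves the width of the interval.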

**20. Significance Tests**

This program explains the basic reasoning behind tests of significance
and the concept of null hypothesis. The program shows how a *z*-test
is carried out when the hypothesis concerns the mean of a normal
population with known standard deviation. These ideas are explored by
determining whether a poem “fits Shakespeare as well as Shakespeare fits
Shakespeare.” Court battles over discrimination in hiring provide
additional illustration.
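
The z-test calculation can be sketched in Python; the word-frequency numbers below are invented, and `normal_cdf` stands in for the normal table.

```python
import math

def normal_cdf(z):
    """Area to the left of z under the standard normal curve."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def z_test(xbar, mu0, sigma, n):
    """Two-sided z test of H0: mu = mu0, with sigma known."""
    z = (xbar - mu0) / (sigma / math.sqrt(n))
    return z, 2 * (1 - normal_cdf(abs(z)))

# Hypothetical stylometry score: H0 mean 7.0, known sd 2.0, 25 passages
z, p_value = z_test(xbar=7.8, mu0=7.0, sigma=2.0, n=25)
print(f"z = {z:.2f}, two-sided P-value = {p_value:.4f}")
```

A P-value near 0.05 says a sample mean as far from 7.0 as 7.8 would occur by chance only about one time in twenty if the null hypothesis were true.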

**21. Inference for One Mean**

In this program, students discover an improved technique for statistical
problems that involve a population mean: the *t* statistic for use
when σ is not known. Emphasis is on paired samples and the *t* test and
confidence interval. The program covers the precautions
associated with these robust *t* procedures, along with their
distribution characteristics and broad applications.
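
The one-sample t statistic is short to compute; a sketch in Python with invented paired differences:

```python
import math
from statistics import mean, stdev

def one_sample_t(data, mu0):
    """t = (xbar - mu0) / (s / sqrt(n)), with s estimated from the data."""
    return (mean(data) - mu0) / (stdev(data) / math.sqrt(len(data)))

# Hypothetical paired differences (after minus before) for five subjects
diffs = [2.0, 1.0, 3.0, 2.0, 2.0]
t = one_sample_t(diffs, 0)  # H0: no change on average
print(f"t = {t:.2f} on {len(diffs) - 1} degrees of freedom")
```

Because σ is replaced by the estimate s, the statistic is referred to a *t* distribution with n - 1 degrees of freedom rather than to the normal table.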

**22. Comparing Two Means**

How to recognize a two-sample problem and how to distinguish such
problems from one- and paired-sample situations are the subject of this
program. A confidence interval is given for the difference between two
means, using the two-sample *t* statistic with conservative degrees
of freedom.
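
A sketch of the calculation in Python, with invented scores for two independently chosen groups; the conservative degrees of freedom are min(n1, n2) - 1.

```python
import math
from statistics import mean, stdev

def two_sample_t(x1, x2):
    """Two-sample t statistic and conservative degrees of freedom."""
    se = math.sqrt(stdev(x1) ** 2 / len(x1) + stdev(x2) ** 2 / len(x2))
    return (mean(x1) - mean(x2)) / se, min(len(x1), len(x2)) - 1

# Hypothetical test scores from two independently chosen groups
group1 = [85, 88, 90, 86, 91]
group2 = [80, 82, 84, 79, 85]
t, df = two_sample_t(group1, group2)
print(f"t = {t:.2f} with conservative df = {df}")
```

Using the smaller sample's degrees of freedom errs on the safe side: the resulting confidence intervals are slightly wider than necessary rather than too narrow.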

**23. Inference for Proportions**

This program marks a transition in the series: from a focus on inference
about the mean of a population to exploring inferences about a
different kind of parameter, the proportion or percent of a population
that has a certain characteristic. Students will observe the use of
confidence intervals and tests for comparing proportions applied in
government estimates of unemployment rates.
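
The confidence interval for a proportion has the same shape as the one for a mean; here is a sketch in Python with an invented survey result.

```python
import math

def proportion_ci(successes, n, z_star=1.96):
    """Interval p_hat +/- z* * sqrt(p_hat * (1 - p_hat) / n)."""
    p_hat = successes / n
    margin = z_star * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - margin, p_hat + margin

# Hypothetical survey: 60 of 1000 respondents counted as unemployed
low, high = proportion_ci(60, 1000)
print(f"95% CI for the unemployment rate: ({low:.4f}, {high:.4f})")
```

The sample proportion 6% comes with a margin of error of about 1.5 percentage points at this sample size.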

**24. Inference for Two-Way Tables**

A two-way table of counts displays the relationship between two ways of
classifying people or things. This program concerns inference about
two-way tables, covering use of the chi-square test and null hypothesis
in determining the relationship between two ways of classifying a case.
The methods are used to investigate a possible relationship between a
worker’s gender and the type of job he or she holds.
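
The chi-square statistic compares the observed counts with the counts expected if the two classifications were independent; a sketch in Python with invented counts:

```python
def chi_square(table):
    """Chi-square statistic: sum of (observed - expected)^2 / expected."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: gender (rows) by job type (columns)
table = [[30, 20],   # men: clerical, managerial
         [20, 30]]   # women
print(f"chi-square = {chi_square(table):.2f}")
```

The null hypothesis of no relationship is rejected when the statistic is large relative to the chi-square distribution with (rows - 1)(columns - 1) degrees of freedom; for this 2-by-2 table that is 1 degree of freedom.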

**25. Inference for Relationships**

With this program, students will understand inference for simple linear
regression, with emphasis on slope and prediction. This unit presents the
two most important kinds of inference: inference about the slope of the
population line and prediction of the response for a given *x*.
Although the formulas are more complicated, the ideas are similar to *t*
procedures for the mean μ of a population.

**26. Case Study**

This program presents a detailed case study of statistics at work.
Operating in a real-world setting, the program traces the practice of
statistics — planning the data collection, collecting and picturing the
data, drawing inferences from the data, and deciding how confident we
can be about our conclusions. Students will begin to see the full range
and power of the concepts and techniques they have learned.