
Lecture 09

For today you should have:

  1. Read Chapter 7.
  2. Homework 6.

Today:

  1. Convolution, including numerous examples of practical use.
  2. Renal tumors.
  3. Central limit theorem.
  4. Abstraction and complexity.
  5. Project time.

For next time:

  1. Homework 7.
  2. Draft report.

Draft report

1) Turning in what you have, on time, is better than turning in something more complete, late.
2) Start with the simple stuff: one variable at a time, just characterize.
3) Don't sweep the anomalies!
4) Question, method, result, conclusion, repeat.
5) Remember the fundamentals: good presentation of numbers, tables and graphs.
6) Negative results are fine as long as the questions were reasonable.

Review the project page and my "suggestions."


Homework 6


1) What happens if you convolve samples from different distributions?

Convolutions of exponential, normal, lognormal and Pareto distributions.
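A quick way to try this is simulation: draw samples from two different distributions, add them pairwise, and compare the empirical CDF of the sums with the CDFs of the addends. This is a sketch using numpy and matplotlib rather than the course code; the sample size and parameters are arbitrary choices.

import numpy as np
import matplotlib.pyplot as plt

n = 10000
rng = np.random.default_rng(17)

# Samples from two different distributions (parameters are arbitrary).
expo = rng.exponential(scale=1.0, size=n)
pareto = rng.pareto(a=2.5, size=n) + 1.0   # shifted so the minimum is 1

# The distribution of the pairwise sum is the convolution of the two
# distributions; adding the samples approximates it.
total = expo + pareto

# Compare empirical CDFs of the addends and the sum.
for sample, label in [(expo, 'exponential'), (pareto, 'Pareto'), (total, 'sum')]:
    xs = np.sort(sample)
    ps = np.arange(1, n + 1) / n
    plt.plot(xs, ps, label=label)

plt.xlabel('value')
plt.ylabel('CDF')
plt.xlim(0, 10)
plt.legend()
plt.show()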


2) What happens if you add up a large number of independent random variates?

The Central Limit Theorem.  Does it always work?  How long does it take?
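One way to explore "does it always work, and how long does it take" is to add up n variates from a skewed distribution for increasing n and check how normal the sums look, for example with a normal probability plot. A sketch along those lines, using lognormal addends (the parameters and sample sizes are arbitrary choices, not the homework solution):

import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(17)
iters = 1000   # number of sums to generate for each n

# For several values of n, add up n lognormal variates, standardize the
# sums, and make a normal probability plot.  As n grows, the points
# should fall closer to a straight line.  With heavy-tailed addends
# (e.g. Pareto with small alpha), convergence is much slower, or the
# classical CLT does not apply at all.
for n in [1, 10, 100]:
    sums = rng.lognormal(mean=0.0, sigma=1.0, size=(iters, n)).sum(axis=1)
    standardized = (sums - sums.mean()) / sums.std()
    stats.probplot(standardized, dist='norm', plot=plt)

plt.title('Sums of n lognormal variates, standardized')
plt.show()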

3) What happens if instead of adding up a bunch of independent random variables, we multiply them together?
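The log of a product is a sum of logs, so by the same argument the product of many positive, independent variates should tend toward a lognormal distribution. A quick sketch to check that (the uniform factors are an arbitrary choice): if the products are lognormal, the log of the products should look normal.

import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(17)

# Multiply n positive variates together, many times over.
n, iters = 100, 1000
factors = rng.uniform(low=0.5, high=1.5, size=(iters, n))
products = factors.prod(axis=1)

# If the product is lognormal, the log of the product should be normal,
# so a normal probability plot of log(products) should be a straight line.
stats.probplot(np.log(products), dist='norm', plot=plt)
plt.title('log of product of 100 uniform variates')
plt.show()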

Abstraction and complexity

The human brain has basic limits, like a working memory capacity of "seven plus or minus two."

This limits our ability to manage complexity, which is why we can do only simple arithmetic in our heads.

We have tools for expanding our natural ability, like writing.  And chunking:

In cognitive psychology and mnemonics, chunking refers to a strategy for making more efficient use of short-term memory by recoding information. More generally, Herbert Simon has used the term chunk to indicate long-term memory structures that can be used as units of perception and meaning, and chunking as the learning mechanisms leading to the acquisition of these chunks.

Abstraction is either another mechanism for managing complexity or maybe a form of chunking.

In the context of modeling, abstraction is when we replace complex things with simple things that include only the features we think are relevant.

In the context of software, abstraction is when we forget about the implementation of a function/object and just think about the interface.

In the context of mathematics, abstraction often comes in the form of an algebra.

An algebra is a set of abstract entities and operations.

In elementary algebra, the entities are numbers and the operations are arithmetic.

In linear algebra, the entities are matrices.  Some of the operations are arithmetic and some are new (like transpose).
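For instance, in numpy terms (a small sketch, with arbitrary matrices):

import numpy as np

# Entities: matrices.
A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])

# Some operations carry over from arithmetic...
C = A + B          # element-wise addition
D = A @ B          # multiplication, with new (non-commutative) rules

# ...and some are new.
E = A.T            # transpose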

The power of algebra becomes apparent only when the problems we are working with exceed our working memory capacity.

Example: how I (and probably you) used to do algebra.

Example: how I used to do linear algebra.

One symptom is "complexity collapse," which is when we can't solve problems just because they exceed our cognitive limits.

Example: structural analysis and structural engineering.

[A "from ... to ..." pair of figures, omitted.]
Proposition: mathematical statistics is distribution algebra.

What does that mean?  What are the entities and what are the operations?

So far, we know how to make distributions, change from one representation to another, and pull out values.

That's like linear algebra where all you know how to do is make a matrix and look up an element.
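For concreteness, here is roughly what "make a distribution, change representation, pull out values" looks like, sketched with scipy.stats frozen distributions rather than the course code; the exponential distribution and its parameters are arbitrary choices.

from scipy import stats

# Make a distribution (the entity).
dist = stats.expon(scale=2.0)

# Change representations: density, cumulative, inverse cumulative.
print(dist.pdf(1.0))          # value of the PDF at x = 1
print(dist.cdf(1.0))          # P(X <= 1)
print(dist.ppf(0.5))          # median, by inverting the CDF

# Pull out values.
print(dist.mean(), dist.var())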

The goal is to start chunking.
