September 6, 2019
For today
Read Think Python 2e Appendix B, just section B.1 (no reading quiz)
Read Think Complexity 2e Chapter 2 and do the reading quiz
Today
Complexity workshop part 3 (Link to slides)
Chapter 2 notebook
Chapter 1 review
Chapter 2 review (probably not today)
For next time
Read Think Python 2e Appendix B, just section B.2 (no reading quiz)
Turn Notebook 02 in (see instructions below)
Prepare for a quiz on the workshop and Chapters 1-2
Watch Ned Batchelder's talk "Loop like a native".
Optional reading: The Inspection Paradox is Everywhere
Note: The workshop goes fast and doesn't explain everything. Don't worry! We'll come back and do it all more carefully.
The only prereq for this class is SoftDes. I will not assume you know anything other than what's in SoftDes (and I won't even assume you remember it all).
For each chapter, there is a corresponding notebook. After reading each chapter, you will work on the notebook, do the exercises, and turn it in.
Here are the steps:
1) In order to minimize Git merge conflicts, you should avoid modifying the notebooks I provide. Before you open each notebook, make a copy:
cp chap02.ipynb chap02mine.ipynb
Or you can open chap02.ipynb in Jupyter and make a copy there.
2) Then open the copy, do the exercises, and save your changes.
3) Use Git to push the modified notebook to your repository:
git add chap02mine.ipynb
git commit -m "Adding chap02mine.ipynb"
git push
4) Go to GitHub, view the notebook you just pushed, and copy the URL, which will be something like
https://github.com/YourGitHubUserName/ThinkComplexity2/blob/master/code/chap02mine.ipynb
5) Go to Notebook 02 on Canvas and paste in the URL of your notebook.
The thesis of the book, and this class, is that we are in the middle of a quiet revolution that is changing:
What’s considered a good model.
What counts as a satisfactory explanation.
What we mean by “science”.
What the interesting questions are.
In the book, I refer to an example from Strogatz, Sync:
I repeated the simulation dozens of times, for other random initial conditions and for other numbers of oscillators. Sync every time. [...] The challenge now was to prove it. Only an ironclad proof would demonstrate, in a way that no computer ever could, that sync was inevitable; and the best kind of proof would clarify why it was inevitable.
I understand the appeal of mathematical proof, up to a point. But IMO, Strogatz focuses on one question:
1) Assuming that all fireflies are the same, and that they can all see each other, prove that they always synchronize (any number, any initial condition).
And ignores what I think is a more interesting question:
2) How do the fireflies synchronize despite the fact that they are not identical, and cannot all see each other? And how far can we push that before it breaks?
To answer those questions, we need things like
1) A network model to represent which fireflies can see each other, and
2) An agent-based model that includes differences between agents.
And that's what this class is about.
In the workshop notes, I outline the ways I think science is changing. We'll look at them now and review them later, when they will make more sense.
Finally, I mention Thomas Kuhn and the idea of a paradigm shift. We'll come back to this later and read one of Kuhn's articles.
Graphs (directed and undirected), nodes, edges, paths.
Random graphs, the ER model.
Connected and completely connected graphs.
Generator functions.
A few ways we could write all_pairs:
def all_pairs(nodes):
    t = []
    for i, u in enumerate(nodes):
        for j, v in enumerate(nodes):
            if i > j:
                t.append((u, v))
    return t
def all_pairs(nodes):
    t = [(u, v) for i, u in enumerate(nodes)
                for j, v in enumerate(nodes)
                if i > j]
    return t
def all_pairs(nodes):
    for i, u in enumerate(nodes):
        for j, v in enumerate(nodes):
            if i > j:
                yield u, v
The last version uses yield, so it is a generator function.
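All three versions produce the same pairs; with the generator version you can wrap the result in list() to see them. A quick check with a small input:

```python
def all_pairs(nodes):
    # generate all distinct unordered pairs of nodes
    for i, u in enumerate(nodes):
        for j, v in enumerate(nodes):
            if i > j:
                yield u, v

print(list(all_pairs(range(3))))   # [(1, 0), (2, 0), (2, 1)]
```

Each unordered pair appears exactly once, because the condition i > j rules out the mirror-image ordering and the self-pairs.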
1) When called, it returns an iterator; that is, an object that provides a __next__ method.
2) The first time __next__ is invoked, the body of the function gets executed until it gets to the yield statement.
3) The next time __next__ is invoked, the function resumes execution from where it left off!
4) The last time it's invoked, flow reaches the end of the function, which raises StopIteration to indicate that we're done.
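We can watch those steps happen by driving a generator by hand with the built-in next(), which invokes __next__ under the hood (countdown is a made-up example, not from the book):

```python
def countdown():
    yield 3
    yield 2
    yield 1

it = countdown()   # calling the function returns an iterator; no body code runs yet
print(next(it))    # runs the body to the first yield, prints 3
print(next(it))    # resumes where it left off, prints 2
print(next(it))    # prints 1
# one more next(it) would reach the end of the function and raise StopIteration
```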
But usually you don't see any of that explicitly. You can do things like this:
import networkx as nx

def make_random_graph(n, p):
    G = nx.Graph()
    nodes = range(n)
    G.add_nodes_from(nodes)
    # random_pairs is a generator that yields each pair with probability p
    G.add_edges_from(random_pairs(nodes, p))
    return G
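random_pairs isn't shown above (it's defined in the notebook), but a sketch consistent with the ER model, where each possible edge is included independently with probability p, might look like this; flip is a small helper assumed here:

```python
import random

def all_pairs(nodes):
    # generate all distinct unordered pairs of nodes
    for i, u in enumerate(nodes):
        for j, v in enumerate(nodes):
            if i > j:
                yield u, v

def flip(p):
    # return True with probability p
    return random.random() < p

def random_pairs(nodes, p):
    # yield each pair of nodes with probability p (the ER model)
    for edge in all_pairs(nodes):
        if flip(p):
            yield edge

# with p=1 every pair survives: 10 choose 2 = 45 edges
print(len(list(random_pairs(range(10), 1.0))))
```

Because random_pairs is itself a generator function, add_edges_from consumes the pairs lazily, without ever building the full list of candidate edges in memory.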
For more on generator functions, see Ned Batchelder's talk "Loop like a native".
Also in Chapter 2: Python sets and lists.