"Too Big to Know" Kirby highlights
In the United States, 275,232 books were published in 2008, a thirty-fold increase in volume from 1900.30 But it’s highly unlikely that your local library got hundreds of times bigger during those past 110 years to accommodate that growth curve. Instead, your library adopted the only realistic tactic, each year ignoring a higher and higher percentage of the available volumes. The filters your town used kept the enormous growth in book-based knowledge out of sight. As a result, library users’ experience of the amount of available knowledge didn’t keep up with its actual growth. But on the Net, search engines answer even our simplest questions with more results than the total number of books in our local library. Every link we see now leads to another set of links in a multi-exponential cascade that fans out from wherever we happen to be standing. Google lists over 3 million hits on the phrase “information overload.”31 There was always too much to know, but now that fact is thrown in our faces at every turn. Now we know that there’s too much for us to know. And that has consequences.
First, it’s unavoidably obvious that our old institutions are not up to the task because the task is just too large: How many people would you have to put on your library’s Acquisitions Committee to filter the Web’s trillion pages? We need new filtering techniques that don’t rely on forcing the ocean of information through one little kitchen strainer. The most successful so far use some form of social filtering, relying upon the explicit or implicit choices our social networks make as a guide to what will be most useful and interesting for us. These range from Facebook’s simple “Like” button (or Google’s “+1” button) that enables your friends to alert you to items they recommend, to personalized searches performed by Bing based on information about you on Facebook, to Amazon’s complex algorithms for recommending books based on how your behavior on its site matches the patterns created by everyone else’s behavior.
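(Not from the book: a toy sketch of the kind of pattern-matching Weinberger alludes to, an item-based recommender that suggests books because people whose purchase histories overlap with yours also bought them. The data and names are made up; Amazon's real system is far more sophisticated.)

# Toy "people who bought X also bought Y" recommender; all data is invented.
from collections import defaultdict
from itertools import combinations

histories = {                         # hypothetical purchase histories
    "ann":   {"Ethan Frome", "Gone with the Wind", "Too Big to Know"},
    "bob":   {"Ethan Frome", "Too Big to Know"},
    "carol": {"Gone with the Wind", "War and Peace"},
}

# Count how often each pair of books appears in the same person's history.
co_counts = defaultdict(int)
for books in histories.values():
    for a, b in combinations(sorted(books), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(user, top_n=3):
    """Rank unread books by how often they co-occur with the user's books."""
    owned = histories[user]
    scores = defaultdict(int)
    for (a, b), n in co_counts.items():
        if a in owned and b not in owned:
            scores[b] += n
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("bob"))  # -> ['Gone with the Wind']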
Second, the abundance revealed to us by our every encounter with the Net tells us that no filter, no matter how social and newfangled, is going to reveal the complete set of knowledge that we need. There’s just too much good stuff.
Third, there’s also way too much bad stuff. We can now see every idiotic idea put forward seriously and every serious idea treated idiotically. What we make of this is, of course, up to us, but it’s hard to avoid at least some level of despair as the traditional authorities lose their grip and before new tools and types of authority have fully settled in. The Internet may not be making me and you stupid, but it sure looks like it’s making a whole bunch of other people stupid.
Fourth, we can see—or at least are led to suspect—that every idea is contradicted somewhere on the Web. We are never all going to agree, even when agreement is widespread, except perhaps on some of the least interesting facts. Just as information overload has become a fact of our environment, so is the fact of perpetual disagreement. We may also conclude that even the ideas we ourselves hold most firmly are subject to debate, although there’s evidence (which we will consider later) that the Net may be driving us to hold to our positions more tightly.
Sixth, filters are particularly crucial content. The information that the filters add—“These are the important pages if you’re studying hypercomputation and cognitive science”—is itself publicly available and may get linked up with other pages and other filters. The result of the new filtering to the front is an increasingly smart network, with more and more hooks and ties by which we can find our way through it and make sense of what we find.
The New Institution of Knowledge
Wide.
Boundary-free.
Populist.
“Other”-credentialed.
Unsettled. We used to rely on experts to have decisive answers. It is thus surprising that in some branches of biology, rather than arguing to a conclusion about how to classify organisms, a new strategy has emerged to enable scientists to make progress together even while in fundamental disagreement.
White House and the American Association for the Advancement of Science—recognize that traditional ways of channeling and deploying expertise are insufficient to meet today’s challenges. Both agree that the old systems of credentialing authorities are too slow and leave too much talent outside the conversation. Both see that there are times when the rapid development of ideas is preferable to careful and certain development. Both acknowledge that there is value in disagreement and in explorations that may not result in consensus. Both agree that there can be value in building a loose network that iterates on the problem, and from which ideas emerge. In short, Expert Labs is a conscious response to the fact that knowledge has rapidly gotten too big for its old containers. . . . 35
Cass Canfield of Harper’s was approached one day in his editorial sanctum by a sweet-faced but determined matron who wanted very much to discuss a first novel on which she was working. “How long should a novel be?” she demanded. “That’s an impossible question to answer,” explained Canfield. “Some novels, like Ethan Frome, are only about 40,000 words long. Others, Gone with the Wind, for instance, may run to 300,000.” “But what is the average length of the ordinary novel?” the lady persisted. “Oh, I’d say about 80,000 words,” said Canfield. The lady jumped to her feet with a cry of triumph. “Thank God!” she cried. “My book is finished!”
When Data.gov launched, it had only 47 datasets. Nine months later, there were 168,000,37 and there had been 64 million hits on the site.38
Obama’s executive order intended to establish—to use a software industry term—a new default. A software default is the configuration of options with which software ships; the user has to take special steps to change them, even if those steps are as easy as clicking on a check box. Defaults are crucial because they determine the user’s first experience of the software: Get the defaults wrong, and you’ll lose a lot of customers who can’t be bothered to change their preferences, or who don’t know that a particular option is open to them. But defaults are even more important as symbols indicating what the software really is and how it is supposed to work. In the case of Microsoft Word, writing multi-page, text-based documents, and not posters or brochures, is the default. The default for Ritz crackers, as depicted on the front of the box, is that they’re meant to be eaten by themselves or with cheese.39
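(Again not from the book, just a minimal sketch of the "default" idea with a made-up EditorConfig: the shipped values are what nearly every user experiences, and overriding them takes a deliberate step.)

# A default is simply the value a setting takes when nobody touches it.
# EditorConfig is invented for this sketch; it is not any real product's API.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class EditorConfig:
    page_size: str = "Letter"       # ships this way
    orientation: str = "portrait"   # ships this way
    columns: int = 1                # ships this way

shipped = EditorConfig()            # what almost every user will ever see
poster = replace(shipped, orientation="landscape", columns=3)  # deliberate override

print(shipped)
print(poster)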
We know that there could be another hundred, thousand, or ten thousand columns of data, and reality would still outrun our spreadsheet. The unimaginably large fields of data at Data.gov—we are back to measuring stacked War and Peaces—do not feel like they’re getting us appreciably closer to having a complete picture of the world. Their magnitude is itself an argument against any such possibility.
Data.gov and FuelEconomy.gov are not parliamentary blue books. They are not trying to nail down a conclusion. Data.gov and the equivalents it has spurred in governments around the world, the massive databases of economic information released by the World Bank, the entire human genome, the maps of billions of stars, the full text of over 10 million books made accessible by Google Books, the attempts to catalog all Earth species, all of these are part of the great unnailing: the making accessible of vast quantities of facts as a research resource for anyone, without regard to point of view or purpose. These open aggregations are often now referred to as “data commons,” and they are becoming the default for data that has no particular reason to be kept secret.
As news spreads from person to person, it sprays out across far wider networks. This is vital because, as Lakhani’s study of InnoCentive discovered, “the further the problem was from the solvers’ expertise, the more likely they were to solve it.”28
software fix but because by then the network will have gotten sufficiently expert in the workarounds.
Note: Sorry to share so much; it seems the best way to accumulate the highlights I want. Is there a good alternative?
The Net becomes more of an expert not just from the content people create for it but also from the links they—we—draw among the pieces. Linking curates the Net. Yet links are content, too. Indeed, one important type of expertise is being able to run the maze of links. The accumulation of links makes the accumulation of content on the Net ever more usable (because it can be found) and valuable (because a context grows around each piece of content).
Note: Thus an assistant professor who makes links that are used and cited often should get tenure.
context and richness.
Note: Not shared from here on.
“True enough, we have internal experts we can draw on,” says Cenkl, “but we’ve also realized that we need to change the way we present ourselves. It’s not necessarily that the MITRE person is the smartest person in the room. We’ve decided that the model needs to evolve so that we become the brokers of expertise. Our value is that we understand the government’s problems really, really well and we can bring the entire community to bear.”
In fact, “[t]he forums don’t necessarily come to consensus. There’s a desire to use all the expertise available but not a pressure to drive to the right answer for all circumstances,” adds Les Holtzblatt, chief Work Practices architect.38 Why? Because networks of interacting experts are smarter than the accumulation of individual expert opinions, whether we’re using a simple mailing list or a more highly structured knowledge management community. This is a fundamental distinction not only in the way expertise is derived but also in its nature: MITRE, which is in the expertise business, finds it often delivers more value to its clients when it involves them in a network of experts who have differing opinions.
Expertise was topic-based. Books focus on specific topics because they have to fit between covers. So, in a book-based world, knowledge looks like something that divides into masterable domains. On the Net, topics don’t divide up neatly. They connect messily. While people of course still develop deep expertise, the networking of those experts better reflects the overall truth that topic boundaries are often the result of the boundaries of paper.
Expertise’s value was the certainty of its conclusions. Books get to speak once. After they’re published, it’s expensive for the authors to change their minds. So, books try to nail things down. But because the multitude of people on the Internet are different in their interests and abilities, a network of experts is of many minds about just about everything. The value of a network of experts can be in opening things up, not simply coming to unshakable conclusions.
Expertise was often opaque. While experts’ reports usually tell us how their conclusions were derived, and typically include supporting data, we don’t expect to be able to go back very far in the experts’ thinking: The reports and their included data have been our stopping points. Networked expertise puts in links to sources—and even to contradictions—as a matter of course. A simple search is likely to turn up contextualizing information about the expert and about the information the expert is relying on.
Expertise was one-way. Books are the original form of broadcasting, a one-to-many medium: The reader can write a big red “No!!!” in the margins, but the author will never know about it. The Net, on the other hand, is multi-way. Any expert who thinks she will talk and we will simply listen has underestimated the Net. We will comment on her site, and if she doesn’t permit comments, we will angrily note that fact on blogs, on Twitter, on Facebook. This multi-way interactivity can make a network of experts more creative, and more responsive to the multitude of ideas and opinions in the world. It can, of course, also create and propagate misinterpretations of the expert’s ideas.
First, the members of the group have a smaller pool of views from which to drink. Second, because people “want to be perceived favorably by other group members,” they will often adjust their views toward the dominant position. “In countless studies, exactly this pattern is observed.”
“cybercascades” in which a belief rapidly gains many believers because it is being passed around the Net as true. Plus, “[a] number of studies have shown group polarization in Internet-like settings.” 29
That is, those visiting the most obvious examples of partisan echo chambers are also more likely than most people to visit sites on the other side of the political divide.
so there’s not enough common ground to even begin a discussion. The only place we have the sort of rational discussions Gore, Sunstein, and Socrates value so highly is within an echo chamber—a room in which people agree thoroughly enough that they can disagree reasonably.
It’s important to be clear about this. We still need as much difference and diversity within the conversation as we can manage. We still need to continuously learn how to manage to include more diversity. We need to be on guard against the psychological tricks echo chambers play, convincing us that our beliefs are “obviously” true and nudging us toward more extreme versions of them. But it’s also fine for the
As a new member getting to hang out with the scholars whose work had guided me, I was struck by the fact that the sorts of things they were saying about Derrida were precisely the sorts of things non-Heideggerians said about Heidegger: He was out to shock, incoherent, purposefully vague, an intellectual charlatan. We were, in short, having a classic echo chamber moment, refusing to take seriously claims that challenged our own. The irony is, of course, that the rest of the world would consider Derrida and Heidegger to be overwhelmingly alike in their ideas.
All knowledge and experience is an interpretation. The world is one way and not others—the stone you stubbed your toe on is really there, and polio vaccine works quite reliably—but our experience of the world is always from a point of view, looking at some features and not others.
Interpretations are social. Interpretation always occurs within a culture, a language, a history, and a human project we care about. The tree is lumber to the woodcutter, a place to climb to the child, and an object of worship to the Druid. This inevitably adds human elements of uncertainty and incompleteness.
There is no privileged position. There are always many ways to interpret anything, and none can claim to be the single best way out of its context. Some postmodernists talk about this in terms of denying that there are “privileged” positions, intentionally invoking not only the Einsteinian sense (all motion is relative) but also, pointedly, the socioeconomic sense (the elite should not get to marginalize the ideas of the rest).
Interpretations occur in discourses. You can’t make sense of something outside of a context. Even something as simple as a car’s turn signal can only be understood within a context that includes cars, the basics of physics, the unpredictable intentions of other drivers, the restrictions of law, and the way left and right travels with one’s body. Ludwig Wittgenstein talked about this in terms of “language games,” by which he meant not something you do for fun but, rather, the way our words and actions are guided by implicit rules and expectations. Postmodernists have many different words for these contexts, but we’ll use the term “discourses.”
themselves social constructions—they are ways people within a culture put ideas together. They are not themselves part of nature, and they change throughout history.