Since the formulation of Gödel's first incompleteness theorem, it has been well known that for any consistent, effectively axiomatizable list of set-theoretic axioms there will be statements neither provable nor refutable from those axioms. What is really striking, however, is how many of the most natural questions about sets are not decidable from the standard axioms of Zermelo-Fraenkel set theory with Choice (ZFC), and how many different ways of deciding these questions are available.
Against this problematic background, Cohen managed to establish the independence of the Axiom of Choice (AC) from Zermelo-Fraenkel (ZF) set theory and the independence of the Continuum Hypothesis (CH) from ZFC. This became possible by developing a novel technique, called forcing, which constitutes a far-reaching generalization of the logical notion of implication and which he used to extend a standard model of set theory.
Cohen's extension method constitutes another significant case of the paradigmatic schema of analysis we developed in the previous Section in relation to Gödel's first incompleteness theorem. This schema involves a process of indirect self-reference via unfolding to a new logical level of hypostasis by utilizing a particular gnomon, that is, a precise "metaphor organon'' furnishing bidirectional bridges for translating between the initial standard model of set theory and a novel model of set theory that is internally distinguishable from the former.
The method of forcing consists in the instantiation of a novel model of set theory from a standard model by the adjunction of certain sets with particular properties, via conditions, called "forcing conditions'', that encode partial information about those sets.
Intuitively, some conditions are stronger than others, that is, they carry more information, and this serves as a criterion for partially ordering them within the standard ground model. In this state of affairs, proving general results about how the conditions of a partially ordered set force certain statements to hold allows one to prove statements referring to the novel constructed models, without examining the forcing conditions themselves anew in each particular case. It is this generality that lends the method its efficiency and universality.
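To fix intuitions, the prototypical example (a standard sketch, not part of the argument above) takes as forcing conditions the finite partial functions from $\omega$ to $\{0,1\}$, ordered by information content:

```latex
\[
P \;=\; \bigl\{\, p \;\bigm|\; p \colon d \to \{0,1\},\ d \subseteq \omega \ \text{finite} \,\bigr\},
\qquad
p \leq q \;\Longleftrightarrow\; p \supseteq q .
\]
```

Here $p \leq q$ reads "$p$ is stronger than $q$'': the condition $p$ decides everything that $q$ decides, and possibly more.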
The fundamental significance of Cohen's method of forcing rests on the following facts:
(1) The forcing extension of the ground standard model satisfies the axioms of ZFC set theory.
(2) Every proposition that holds in the extension, is forced by some condition in the partially ordered set.
(3) The forcing relation is definable in the ground model.
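In modern notation these three facts are usually summarized as follows (a standard formulation, assuming a ground model $M$, a partial order $P \in M$, an $M$-generic filter $G$, and names $\tau$ with interpretations $\tau_G$ in the extension):

```latex
\[
M[G] \models \mathrm{ZFC}, \qquad
M[G] \models \varphi(\tau_G)
\;\Longleftrightarrow\;
\exists\, p \in G :\; p \Vdash \varphi(\tau),
\qquad
\{\, p \in P \mid p \Vdash \varphi(\tau) \,\} \ \text{is definable in } M .
\]
```

The middle equivalence is the so-called Truth Lemma, and the last clause is the Definability Lemma.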
The crux of the matter is the delineation of an appropriate generic filter that actually accomplishes the required extension. In the case of a countable transitive ground model, for any chosen partial order of elements of the ground model, to be interpreted as forcing conditions, there exist only countably many dense subsets of the partially ordered set belonging to the ground model, and these may be enumerated externally.
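The construction behind this observation can be sketched concretely as a toy computation (an illustrative sketch with names of my own choosing, not part of the text): conditions are finite partial functions from the natural numbers to $\{0,1\}$, the dense set $D_n$ consists of the conditions deciding the value at $n$, and extending a condition step by step through the enumerated dense sets yields a descending chain whose upward closure is the generic filter.

```python
# Illustrative sketch: Cohen forcing conditions as finite partial functions
# p : N -> {0,1} (Python dicts), ordered by reverse inclusion, and the
# Rasiowa-Sikorski-style construction of a chain meeting D_0, ..., D_{k-1},
# where D_n = { p : n is in the domain of p }.

def is_stronger(p, q):
    """p <= q in the forcing order: p extends q as a partial function."""
    return all(n in p and p[n] == v for n, v in q.items())

def meets(p, n):
    """Membership in the dense set D_n = {p : n in dom(p)}."""
    return n in p

def generic_chain(k):
    """Descending chain p_0 >= p_1 >= ... with p_{n+1} in D_n for n < k."""
    p, chain = {}, [{}]          # p_0 is the empty (weakest) condition
    for n in range(k):
        if not meets(p, n):      # extend p just enough to enter D_n
            p = dict(p)
            p[n] = 0             # any decision at n works for genericity
        chain.append(dict(p))
    return chain
```

The union of the chain is a fragment of the generic function $G$; genericity here concerns only the enumerated dense sets, which is why the arbitrary choice of the value $0$ at each step is harmless in the toy version.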
The restriction that the ground model of set theory be a countable transitive model can be lifted. In that case, however, it is in general not possible to delineate a generic filter, and consequently the method of "Boolean values'' seems to provide the most general approach, in view of the notion of a Boolean topos.
From a philosophical standpoint, Cohen's main innovation lies in the distillation of the notion of forcing via a chosen partial order in the ground model, and in the conception of the existence of a generic filter in this partial order containing elements not already grasped in the ground model. This innovation made it possible to secure suitable properties of a novel set, borne by the unfolding of a standard model through a generic filter, without having all of its members in hand ab initio.
The essential idea is that the conceptual manoeuvre of evaluating statements by means of the Boolean algebra completion of the chosen set of forcing conditions in the ground model of set theory provides the means to tackle the problem that the elements of the generic set are not specified ab initio.
This gives rise to a non-standard Boolean-valued model of set theory. In other words, a Boolean-valued model, besides the crystallized elements, contains elements that are "partially or locally distinguishable'', where the extent of their distinguishability is provided by the "truth value'' they are assigned in the whole Boolean algebra (thought of as a domain of truth values) rather than in the standard bivalent true/false one. This logical manoeuvre in the specification of the Boolean-valued set allows us to think of it as a set of "potential members'' in the process of "constellatory unfolding'' from the ground level to a new level in "statu nascendi'', to be eventually crystallized.
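The usual formalization of this passage from two truth values to Boolean ones (a standard sketch; $B$ denotes the Boolean completion of the forcing poset) assigns to every statement $\varphi$ a value $\|\varphi\| \in B$ compatible with the logical connectives:

```latex
\[
\|\varphi \wedge \psi\| = \|\varphi\| \wedge \|\psi\|,
\qquad
\|\neg \varphi\| = \neg\, \|\varphi\|,
\qquad
\|\exists x\, \varphi(x)\| = \bigvee_{\tau} \|\varphi(\tau)\| .
\]
```

A statement holds in the Boolean-valued model exactly when $\|\varphi\| = 1$, while the "partially or locally distinguishable'' elements are those whose membership values lie strictly between $0$ and $1$.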
Cohen's extension method via the technique of forcing can also be interpreted temporally, i.e. by means of a "temporal unfolding dimension'' from the level of ZFC standard models of set theory to the level of non-standard Boolean-valued models, which may be thought of topologically as a spirally unfolding dia-stasis that covers the ground level.
This is suggestive of the semantics given to the elements of a Boolean-valued model, namely as being only eventually crystallized, giving rise in this manner to a new standard model of set theory. The difference from Gödel's case lies in the conception and utilization of a different type of unfolding, made possible through the forcing method by delineating as a gnomon a generic set that does not belong to the initial ground model.
More precisely, Gödel's argument may be thought of as involving a process of unfolding in relation to an inner model, whereas Cohen's argument involves a process of unfolding in relation to an outer model, although the definability of the forcing relation still involves the ground model.
... it’s somewhat curious that in a certain sense the continuum hypothesis and the axiom of choice are not really difficult problems-they don’t involve technical complexity; nevertheless, at the time they were considered difficult. One might say in a humorous way that the attitude toward my proof was as follows. When it was first presented, some people thought it was wrong. Then it was thought to be extremely complicated. Then it was thought to be easy. But of course it is easy in the sense that there is a clear philosophical idea. There were technical points, you know, which bothered me, but basically it was not really an enormously involved combinatorial problem; it was a philosophical idea.
... the [generic] set G will not be determined completely, yet properties of G will be completely determined on the basis of very incomplete information about G. I would like to pause and ask the reader to contemplate the seeming contradiction in the above. This idea as it presented itself to me, appeared so different from any normal way of thinking, that I felt it could have enormous consequences. On the other hand, it seemed to skirt the possibility of contradiction in a very perilous manner. Of course, a new generation has arisen who imbibe this idea with their first serious exposure to set theory, and for them, presumably, it does not have the mystical quality that it had for me when I first thought of it. How could one decide whether a statement about G is true, before we have G? In a somewhat exaggerated sense, it seemed that I would have to examine the very meaning of truth and think about it in a new way.
To work inside [the ground model] V, we consider the set of $P$ which forces a given set to lie in [the generic set] G or not lie in G. Because forcing is defined in V, we can look at all possibilities of assigning sets of P, which force the members of x to lie in an arbitrary y. This set is the "truth value'' of the statement. So, in ordinary set theory a subset of x is determined by a two-valued function on the members of x. In our situation, a subset is determined by a function taking its values in the subset of the elementary conditions. These values are all in the model V. Thus we can quantify over all possible truth values...
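Cohen's description in this last passage corresponds to what is now the standard definition of a Boolean-valued name (a modern rendering, not Cohen's own notation): a "subset'' is a function $u$ from names into the algebra $B$ of truth values, and membership is evaluated by

```latex
\[
\| \tau \in u \| \;=\; \bigvee_{\sigma \in \mathrm{dom}(u)} u(\sigma) \wedge \| \tau = \sigma \| .
\]
```

The two-valued characteristic function of ordinary set theory is recovered as the special case in which every value $u(\sigma)$ is either $0$ or $1$.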