Integrated Information Theory (IIT) is a theory of consciousness developed by the neuroscientist Giulio Tononi beginning in 2004 and prominently championed by Christof Koch. IIT claims that phenomenal consciousness corresponds to maximal integrated information. That is, a system has experiences if, and only if, its parts are highly causally interconnected. IIT quantifies this causal interconnectedness using Φ (phi), a mathematically defined measure of the degree to which a system’s present state is constrained by its immediate past state and constrains its immediate future state.
Since IIT suggests everything is either conscious, composed of conscious things, or itself part of something conscious, it implies a kind of panpsychism. Unlike proponents of competing theories of consciousness, Tononi supports his view with an argument from phenomenology as well as with empirical data: IIT starts by taking the existence of conscious experience as given and identifying five self-evident fundamental properties that every experience has.
Christof Koch
Giulio Tononi
1) Consciousness exists intrinsically.
2) Experiences are made up of phenomenological distinctions.
3) Each experience is specific: it is differentiated from other possible experiences by its particular distinctions.
4) Experiences are unified. They cannot be reduced to their particular distinctions.
5) Consciousness has definite bounds that do not overlap, and phenomena are either inside or outside those bounds.
Information, in IIT’s sense, is the degree to which a system’s current state constrains its possible past and future states. Unlike everyday "information" (data or knowledge), it is about causal power. For example, a light switch’s state (on/off) carries minimal information because it places almost no constraint on what came before or comes after, while a brain state is highly informative because its neurons interact in ways that sharply limit which states could precede and follow it.
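This notion of information as causal constraint can be illustrated with a toy sketch. This is not IIT’s actual cause-repertoire calculus, only an illustration under a simplifying assumption (a uniform prior over all binary past states); the `reset` and `rotate` systems are hypothetical:

```python
from itertools import product
import math

def info_about_past(transition, current_state, n):
    """Bits of uncertainty about the past that the current state removes,
    assuming a uniform prior over all 2**n binary past states."""
    compatible = [s for s in product((0, 1), repeat=n)
                  if transition(s) == current_state]
    return n - math.log2(len(compatible))

n = 3
# Switch-like system: the next state ignores the past entirely,
# so observing it tells us nothing about what came before.
reset = lambda past: (0, 0, 0)
# Coupled system: each element copies a neighbour, so the present
# state pins down exactly one possible past state.
rotate = lambda past: (past[2], past[0], past[1])

print(info_about_past(reset, (0, 0, 0), n))   # 0.0 bits
print(info_about_past(rotate, (1, 0, 1), n))  # 3.0 bits
```

The `reset` system is compatible with every past state, so its present carries zero information; the `rotate` system’s present is compatible with exactly one past state, the maximum three bits for three binary elements.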
Integration is the degree to which a system's information depends on the interdependence of its parts. A system has high integration if it cannot be split without losing causal connections. For example, the brain has high integration because partitioning it would sever millions or billions of neuronal connections. A computer keyboard isn’t integrated because removing a key doesn’t disrupt the rest.
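The cost of splitting a system can be sketched with a toy measure of integration: count how many states change their successor when a cut severs the causal connections between the parts. The `xor_net` and `copy_net` systems below are hypothetical, and this measure is far cruder than IIT’s real Φ:

```python
from itertools import product

def cut_effect(step, n, part_a):
    """Count states whose successor changes when the causal connections
    between part_a and the rest are severed (cross-inputs clamped to 0)."""
    part_b = [i for i in range(n) if i not in part_a]
    changed = 0
    for s in product((0, 1), repeat=n):
        sa = tuple(s[i] if i in part_a else 0 for i in range(n))
        sb = tuple(s[i] if i in part_b else 0 for i in range(n))
        cut_next = tuple(step(sa)[i] if i in part_a else step(sb)[i]
                         for i in range(n))
        if cut_next != step(s):
            changed += 1
    return changed

# Integrated: every element XORs the other two (all-to-all coupling).
xor_net = lambda s: (s[1] ^ s[2], s[0] ^ s[2], s[0] ^ s[1])
# Modular: each element just holds its own state; parts never interact.
copy_net = lambda s: s

print(cut_effect(xor_net, 3, [0]))   # 6 of 8 states change: hard to split
print(cut_effect(copy_net, 3, [0]))  # 0: splitting costs nothing
```

Cutting one element out of the XOR network changes the successor of six of the eight possible states, while cutting the keyboard-like `copy_net` changes nothing, mirroring the brain-versus-keyboard contrast above.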
Maximality (Φmax) is the condition that a system’s Φ is higher than the Φ of each of its parts and of any larger system containing it. Only a system with Φmax is considered phenomenally conscious. In humans, that system is held to be (a portion of) the cerebral cortex, not individual neurons and not the whole body. This exclusion principle avoids "double-counting" conscious phenomena.
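The exclusion step can be sketched with a toy stand-in for Φ. The hypothetical `toy_phi` below scores a candidate system by the cost of its cheapest bipartition; IIT’s actual calculus (implemented in the PyPhi package) is considerably more involved, but the selection logic is the same: among nested candidates, only the one with the maximal score counts.

```python
from itertools import product, combinations

# Toy dynamics: nodes 0-2 form a tightly XOR-coupled core, while node 3
# merely copies node 0 (a hypothetical network, for illustration only).
def step(s):
    return (s[1] ^ s[2], s[0] ^ s[2], s[0] ^ s[1], s[0])

def cut_cost(nodes, part_a):
    """How many of the 16 global states change their successor (restricted
    to `nodes`) when inputs crossing the cut are clamped to 0."""
    part_b = [i for i in nodes if i not in part_a]
    changed = 0
    for s in product((0, 1), repeat=4):
        sa = tuple(s[i] if (i in part_a or i not in nodes) else 0 for i in range(4))
        sb = tuple(s[i] if (i in part_b or i not in nodes) else 0 for i in range(4))
        cut = tuple(step(sa)[i] if i in part_a else step(sb)[i] for i in nodes)
        if cut != tuple(step(s)[i] for i in nodes):
            changed += 1
    return changed

def toy_phi(nodes):
    """Crude stand-in for Φ: the cost of the cheapest bipartition."""
    return min(cut_cost(nodes, list(pa))
               for r in range(1, len(nodes))
               for pa in combinations(nodes, r))

candidates = [(0, 3), (0, 1, 2), (0, 1, 2, 3)]
print({c: toy_phi(c) for c in candidates})
# {(0, 3): 8, (0, 1, 2): 12, (0, 1, 2, 3): 8} -> the XOR core wins
```

The one-way pair (0, 3) splits cheaply, and so does the whole network, because the cut severing the weakly attached node 3 is cheap; the XOR core (0, 1, 2) scores highest, so under exclusion only it, not its parts and not the larger whole, would count as conscious.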
Nodes represent a system's parts and arrows represent causal relations. System A more closely resembles the causal relations between parts of the brain.
System A has high integration, featuring many two-way causal connections. Partitioning A5 would drastically change the immediate future state of both A5 and the remaining parts of A.
System B has low integration. B features minimal, one-way causal connections, akin to the feedforward connectivity seen in conventional computers. Partitioning B5 changes the future state of B6, but the rest of B remains largely unaffected. Partitioning B6 doesn't affect the rest of B at all.
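The contrast in the two captions above can be mimicked in a small simulation, assuming a recurrent XOR ring as a stand-in for System A and a feedforward copy chain for System B (both hypothetical; node indices are 0-based, so B5 corresponds to index 4):

```python
def step_chain(state):
    """Feedforward chain (like System B): node i copies node i-1; node 0 holds."""
    return [state[0]] + state[:-1]

def step_ring(state):
    """Recurrent ring (like System A): each node XORs its two neighbours,
    so influence flows in both directions around the ring."""
    n = len(state)
    return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

def run(step, state, removed=None, steps=3):
    """Advance the system; a 'removed' node is clamped to 0 after each step,
    a crude way of partitioning it off."""
    for _ in range(steps):
        state = step(state)
        if removed is not None:
            state[removed] = 0
    return state

init = [1, 0, 1, 1, 0, 0]
for step in (step_chain, step_ring):
    intact = run(step, init[:])
    cut = run(step, init[:], removed=4)
    affected = [i for i in range(6) if intact[i] != cut[i]]
    print(step.__name__, affected)  # chain: [5]; ring: [0, 2, 4]
```

In the chain, removing node 4 leaves everything but its immediate downstream neighbour untouched; in the ring, the perturbation spreads through the feedback loops to distant nodes, which is the intuition behind System A's higher integration.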