
> There's a great philosophy paper making this argument: Schwitzgebel, Eric. “If Materialism Is True, the United States Is Probably Conscious.”

Maybe, but this strikes me as a borderline category error, like saying, "If databases can order results, then they are probably sorting algorithms". The argument assumes that consciousness is transitive or "sticky" in some sense, where if a property applies to a part of a system it must also apply to some aggregate of those parts.



> The argument assumes that consciousness is transitive or "sticky" in some sense, where if a property applies to a part of a system it must also apply to some aggregate of those parts.

I think the argument is really about substrate-independence. If consciousness is just about functional properties, then why can’t a social collectivity exhibit those functional properties?

Many materialists do in fact endorse substrate-independence - the common belief that an AI/AGI could in principle be as conscious as we are (even if current generations of AI likely aren't) depends on it - and I think substrate-independent materialism likely does fall victim to this argument. Now, maybe not, if there is some functional property that individual humans and animals possess but which their social collectivities lack - but then, what is that property?

Other viewpoints don't endorse substrate-independence. For example, Penrose-Hameroff's orchestrated objective reduction, if you would even call that materialism. You can interpret it as a materialist theory (e.g. the in-principle empirically testable claim that neurons have a physical structure with certain unique quantum properties, plus the less testable claim that those properties are essential for consciousness) or as a dualist theory (e.g. those alleged unique quantum processes as a vehicle for classical Cartesian interactionism). The more materialist reading could be viewed as a substrate-dependent materialism which escapes Schwitzgebel's argument. But I don't think most materialists want to go there (it seems too dualism-adjacent), and the theory's claims about QM and neurobiology are unproven and rather dubious.


I have no problem with substrate-independence, but substrate-independence alone doesn't mean the "consciousness" property is transitive in the way needed to claim the US is conscious. You need additional assumptions beyond just materialism and substrate-independence.

Computations are mechanistically clear, deterministic, and substrate-independent, but they still have properties that we use to classify them. Just because some component of a larger computational system has a property (or classification), it does not entail that the larger system has that property (or classification). Consciousness could be like this.
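To make that concrete, here's a toy sketch (the function names are mine, purely illustrative): a larger system can contain a component that sorts without itself being "a sorting algorithm" - the property doesn't automatically transfer from part to whole.

```python
def sort_component(xs):
    """A component that has the 'sorting' property."""
    return sorted(xs)

def larger_system(xs):
    """Uses the sorting component internally, but its own
    input-output behavior is not sorting: it returns the median."""
    ordered = sort_component(xs)
    return ordered[len(ordered) // 2]

# The component sorts; the whole system does not.
print(sort_component([9, 1, 5, 7, 3]))  # [1, 3, 5, 7, 9]
print(larger_system([9, 1, 5, 7, 3]))   # 5 - a single number, not a sorted list
```

The analogy only goes so far - nobody has a crisp functional definition of consciousness the way we have one for sorting - but it shows that "a part has property P" does not by itself license "the aggregate has property P".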


Schwitzgebel isn't assuming any property of consciousness is transitive.

Rather, what he is saying is this: given many candidate substrate-independent materialist definitions of consciousness, if the property is true of individual humans and animals, it will be true of their social collectivities. But he says "many", not "all" - he's not claiming you can't define properties of consciousness for which that is false; he's simply putting the onus on proponents of the "materialism can explain consciousness" project to explain in detail how, and to justify such a definition.

Furthermore, he's not claiming that properties of consciousness are transitive from individual organisms to any arbitrary grouping of them. Rather, he's pointing to social collectivities such as countries or governments as being so coordinated that they sometimes act as if they have a will of their own, emerging from the coordinated wills of their individual members. This isn't true of arbitrarily defined wholes of which those individuals are part, e.g. the set of all humans (anywhere on earth) whose first name starts with the letter A. You are interpreting his point as part-to-whole transitivity in general, but that's not the case - the emergent properties of social groups are far more complex than simple part-to-whole transitivity. And it is those emergent properties he points to as evidence that such collectivities may have a consciousness distinct from those of their individual members.


> Rather, he’s pointing to social collectivities such as countries or governments as being so coordinated they sometimes act like they have will of their own emerging from the coordinated wills of their individual members.

The complexity of this behaviour could be at the level of an amoeba rather than a human mind. It's just not persuasive without knowing more about what, specifically, materialist consciousness requires.


The difference here is that we know the definition of a sorting algorithm, but we don't have a working definition of "consciousness". The argument is: if we use materialist definitions, then lots of unexpected things fit the definition.


> The argument is, if we use materialist definitions, then lots of unexpected things fit the definition.

No, this doesn't follow; that's my point. It depends on further constraining materialist consciousness to have the specific kind of property I described. It's possible that consciousness is like "sorting algorithm": a specific property of a specific kind of system, where aggregates of such systems don't necessarily have that same property. They might, but it's too strong to say they definitely or probably do.



