---
zotero-key: N6CNAC3L
zt-attachments: 618
citekey: merakchiHardProblemConsciousness2021
aliases:
  - The hard problem of consciousness
---

up:: [[zotero literature notes]] link:: Zotero attachment #s/PKM #zotero #t/source

> [!zotero]+ The hard problem of consciousness - Page
> Consciousness is best defined as our ability to be aware of ourselves and the world around us. It is this one undeniable feature of reality that is utterly invulnerable to all doubt.

> [!note] Notes
> Problem: this just wraps the definition of consciousness in the word "aware", which is not precise or clear. In particular, saying that awareness is fundamentally impossible for matter means that anti-materialism is already built into your definition of consciousness. ^MF6RRDBLaQEN7G3X4

> [!zotero]+ The hard problem of consciousness - Page
> Neuroscience can indeed deal with what David Chalmers distinguished as “the easy problem of consciousness”, which engages with questions about the brain's mechanisms, awakeness and sleep, the ability to discriminate and categorize stimuli, behaviors, the structure of the brain, its dynamics, and in a general way the functions of consciousness.

> [!note] Notes
> Here lies an important claim: that the "hard problem of consciousness" is fundamentally different from the "easy problem of consciousness". Of course, if you define consciousness through awareness and you hold that awareness is fundamentally not material, then this difference follows. But then you are basically saying that consciousness is not material because consciousness is not material: since all your deductions are already contained in your very definition of consciousness, the conclusion is trivially true, but it begs the questions "is your definition correct?" and "is awareness really fundamentally immaterial?" ^QXF5AY9JaQEN7G3X4

> [!zotero]+ The hard problem of consciousness - Page
> Subjective first-person experience is not something that we can know or understand via brain states only.

> [!note] Notes
> This is wrong because it does not distinguish between knowing and merely understanding. It is true that you cannot intuitively know a quale, but it remains to be proven that you cannot understand it. After all, a quale could very well be just a state of brain cells, and that would not make it any easier to communicate! Analogy with virtual machines. ^A9CBZ28AaQEN7G3X4

> [!zotero]+ The hard problem of consciousness - Page
> but no amount of collected data or mapped out brain states can make you know what it is like for me to experience the color red.

> [!note] Notes
> It is true that you cannot know it, because your personal experience is limited to what gets into your own brain. Through data you can only obtain a representation of what is inside someone else's brain, not the thing itself. This is why you can understand someone's experience of the color red, but you cannot feel it, you cannot know it intuitively.
>
> Analogy with CPUs that are both Turing-complete (hence computationally equivalent) but cannot execute the same instructions; see the sketch below. ^9ZTL8N6DaQEN7G3X4
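A minimal sketch of that CPU/virtual-machine analogy in Python (the toy instruction sets, opcode names, and function names are invented here for illustration and are not taken from the source): two machines are computationally equivalent, so each can emulate ("understand") the other's behaviour, yet neither can natively execute ("know from the inside") the other's instructions.

```python
# Machine A natively understands the opcodes "INC" and "DEC".
def run_on_machine_a(program, value=0):
    for op in program:
        if op == "INC":
            value += 1
        elif op == "DEC":
            value -= 1
        else:
            raise ValueError(f"machine A cannot execute {op!r}")
    return value

# Machine B natively understands "ADD1" and "SUB1": same computational power,
# different encoding.
def run_on_machine_b(program, value=0):
    for op in program:
        if op == "ADD1":
            value += 1
        elif op == "SUB1":
            value -= 1
        else:
            raise ValueError(f"machine B cannot execute {op!r}")
    return value

# Machine A can still emulate machine B by translating its instructions:
# it reproduces B's behaviour (understanding) without ever running B's
# opcodes natively (knowing "from the inside").
def emulate_b_on_a(program_b, value=0):
    translation = {"ADD1": "INC", "SUB1": "DEC"}
    return run_on_machine_a([translation[op] for op in program_b], value)

program_b = ["ADD1", "ADD1", "SUB1"]
assert run_on_machine_b(program_b) == emulate_b_on_a(program_b) == 1
# run_on_machine_a(program_b) would raise: machine A cannot execute 'ADD1'
```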

> [!zotero]+ The hard problem of consciousness - Page
> All that data doesn't suffice for someone who has been blind since birth, for example, to know the experience of the color red because there is no property in the object that allows us to experience the object.

> [!note] Notes
> Is this due to a fundamental limitation of matter, or simply because we are not intelligent enough to process all this information intuitively? If a very intelligent being were able to get all the data about our brains and to intuitively understand everything going on in them, we could probably say that it knows what it is like for a human to see. ^CTVX2QM3aQEN7G3X4

> [!zotero]+ The hard problem of consciousness - Page
> Understanding the neurochemical activity that correlates to pain is still not sufficient to know what it is like for a particular person to be in pain.

> [!note] Notes
> True: you would also need to understand how the brain processes this pain and brings it into consciousness. ^PT256H9PaQEN7G3X4

> [!zotero]+ The hard problem of consciousness - Page
> The brain states are not causative for the conscious states. A good example of that is Thomas Huxley's analogy about how Aladdin rubbing his lamp is correlated to yet unaccountable for the appearance of the genie.

> [!note] Notes
> This is wrong. Even if we grant that consciousness has some external, "magical" cause, it is a fact that the brain has some influence on consciousness: brain states are causative for conscious states, even if they are not the one and only cause.
>
> Moreover, the claim that consciousness is caused by something other than brain states remains to be proven. Qualia only show that one consciousness cannot understand another consciousness (or that a brain cannot mimic another brain). ^R3Y4WYRXaQEN7G3X4

> [!zotero]+ The hard problem of consciousness - Page
> In his book “an atheist guide to reality”, Alex Rosenberg argues that neurons cannot account for “intentionality” because they cannot be “about” themselves or “about” anything outside of themselves. He explains that “intentionality” as a feature of consciousness cannot be an intrinsic property of physical objects.

> [!note] Notes
> This is because "intentionality" is not a property: being "about something" is not a physical property.
>
> Intentionality is created by consciousness (it is a part of it, so it is caused by it). Therefore:
>
> - talking about the intentionality of neurons is a scale error, like talking about the "sharpness" of individual atoms (see the section about emergence)
> - it is a circular argument: you define "being about" as something that cannot be achieved by material things, then you say that consciousness requires it. But ^8R6AN9THaQEN7G3X4