Fear of death.

(in fact this is what is disturbing about the reports from split-brain patients -- that so much of what we would like to think of as an integral, coherent consciousness is mostly autonomic muck)

Philip Nunez, Friday, 13 July 2012 17:33 (eleven years ago) link

but conscious experience is utterly unlike anything else in the natural world! It is a completely different kind.

i'm not sure that this is true. to the conscious mind, consciousness seems special, but this seeming is not necessarily proof of anything but how things seem. or rather that things seem. i think it's likely that awareness of the sort we experience is not something that simply IS or IS-NOT, but rather is something that accretes gradually as certain types of intelligence develop or evolve. i wonder how large a role self-awareness plays in the development of consciousness - not merely to know things, but to know what one knows and how one came to know it, to know not only the world, but oneself in it. it doesn't seem unreasonable to think that something like conscious self-awareness might develop in bits and pieces out of such knowings, out of recursive layers of self-knowledge.

i guess we have to ask at this point whether "awareness" is anything but a kind of information processing. it might be, but i don't see any compelling reason to suppose so. its quality of "seeming" seems unique and possibly non-physical to us, but i expect that this kind of awareness is really just information processing. in fact, the insistence that consciousness must be a thing of a super-special sort strikes me as odd.

contenderizer, Friday, 13 July 2012 17:40 (eleven years ago) link

i guess i mean that a mechanism (biological or otherwise) that experiences the first faint, super-primitive glimmerings of conscious awareness does not necessarily contain any special kind of energy or matter that a similar but non-conscious machine would lack.

the presence of awareness does not require the introduction of a new "stuff of awareness" or "energy of awareness" into the system. rather, "awareness" is simply a way of describing a particular arrangement of what's already there.

contenderizer, Friday, 13 July 2012 17:50 (eleven years ago) link

xp

Adult human consciousness is super-special only in terms of its extreme complexity. The obvious difference between adult human consciousness and infant consciousness suggests the degree to which what we experience within our conscious mind is extremely contingent on post-natal experience and cannot in any way be separated from it. This grounds consciousness decisively in the material world, imo. It can seem to emerge somehow out of the nebulous mists of selfhood because of all we forget and all we ignore.

Aimless, Friday, 13 July 2012 17:57 (eleven years ago) link

I have no problem with a gradual accretion of awareness, but there has to be stuff to accrete and I just don't see how matter (as we normally consider it) can be that stuff, for me the conceptual gap between the physical and the mental is just too great. They are just different logical kinds. That you don't share that view, well I guess nothing remains but to gaze uncomprehendingly through the glass at each other. xp.

ledge, Friday, 13 July 2012 18:02 (eleven years ago) link

we get into trouble, i think, when we treat our own, fantastically complex and multilayered adult awareness as the default example of what "awareness" is. this is like treating a jet fighter as the default example of "mode of transportation". how could such a thing just come to exist? what precedent for it is there elsewhere in the natural world? neither thing just came to exist. both are the product of millennia of development and refinement. both echo ancestors so primitive we could hardly recognize the one in the other.

and aimless otm. even in humans, conscious awareness seems to develop more than simply exist. though, of course, it's impossible to say for certain...

contenderizer, Friday, 13 July 2012 18:04 (eleven years ago) link

I don't think that is the problem! No serious student of philosophy of mind would assume human consciousness just winked into existence.

ledge, Friday, 13 July 2012 18:08 (eleven years ago) link

I have no problem with a gradual accretion of awareness, but there has to be stuff to accrete and I just don't see how matter (as we normally consider it) can be that stuff, for me the conceptual gap between the physical and the mental is just too great.

i view the stuff in question as information, or rather as information-processing systems/patterns/whatever. information isn't really "matter" per se, but it is encoded materially, and it's in the action and interaction of the material involved that the processing takes place.

contenderizer, Friday, 13 July 2012 18:11 (eleven years ago) link

And I agree that consciousness is "grounded" in the brain, I don't subscribe to free-floating mental phenomena. But mental events are not identical with or reducible to physical ones, and calling them emergent doesn't get you any further. But I'm just repeating myself now.

xp, that seems to be begging the question, or putting the cart before the horse, or something. To call something 'information' assumes a thinking, aware subject, not vice versa.

ledge, Friday, 13 July 2012 18:13 (eleven years ago) link

I don't think that is the problem! No serious student of philosophy of mind would assume human consciousness just winked into existence.

i wonder. when reduced to its minimal essence, what is awareness? what is the least thing that might qualify? could awareness lack a sense of self, a sense even of will? probably. could it lack language, emotion and memory? perhaps. since we can't really know any awareness but our own, we can only speculate about other sorts. it's possible that plants and even computers are aware in ways we can't perceive. and it doesn't require that we bring any new stuff to our existing conceptions of these things. at least not so far as i can see. to process is to be at least theoretically capable of awareness.

contenderizer, Friday, 13 July 2012 18:18 (eleven years ago) link

To call something 'information' assumes a thinking, aware subject, not vice versa.

A sunflower can respond to sunlight by turning toward the sun. The sunlight provides the plant with the information about which direction to turn. I presume you accept this as proof that sunflowers are both thinking and aware. It makes reasonable sense to me to extend these definitions to cover this case, but I am not sure this is how you meant it.

Aimless, Friday, 13 July 2012 18:23 (eleven years ago) link

To call something 'information' assumes a thinking, aware subject, not vice versa.

i am not sure that this is true. imagine a simple organism that can sense light and move towards it through a fluid medium. it's constructed to remain in the warmer, oxygen and life-rich top layers of the ocean. though it is not "aware" in any conscious sense (i invented it, so i get to decide), it is able to gather information about its environment and respond accordingly. the information it gathers is pretty much limited to "where's the light at", but that's sufficient for its purposes.

in that creature is a switchboard that coordinates light-sensing and the actions by which it moves in the direction of the light it's sensed. this switchboard isn't really a "brain" yet, and doesn't need to think or feel. it just does a certain thing under a certain condition, on or off. over time, as the creature evolves and becomes more complex, it acquires new senses and new behavior routines that are triggered under this or that condition. it takes in and processes a lot more information about its surroundings, but still doesn't need or have awareness ... up to a point.

at some point maybe it does begin to become aware, but this is long after it has begun to process information (evolutionarily speaking).
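
(to make that concrete, here's a toy sketch in python. every name and number is invented for illustration, and it's only meant to show the shape of the thing: this is all the "switchboard" has to be, and nothing in it needs to be aware.)

# toy sketch of the "switchboard": pure stimulus-response, no awareness required.
# every name and number here is invented for illustration.

def light_gradient(position, light_source):
    # signed difference telling the creature which way is brighter
    return light_source - position

def switchboard(gradient, threshold=0.1):
    # on/off rule: swim toward the light if it's noticeably off to one side
    if gradient > threshold:
        return +1   # toward the light
    elif gradient < -threshold:
        return -1   # also toward the light, from the other side
    return 0        # bright enough right here, stay put

def live_one_moment(position, light_source, speed=0.5):
    # sense, decide, act: one loop, no memory, no model, no "seeming"
    return position + switchboard(light_gradient(position, light_source)) * speed

# run it for a while: the creature climbs toward the light-rich layer
pos = -10.0
for _ in range(40):
    pos = live_one_moment(pos, light_source=0.0)
print(pos)  # hovers near the light, having "gathered information" the whole time

from the outside it's plainly gathering and acting on information about its environment; from the inside there's nobody home.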

contenderizer, Friday, 13 July 2012 18:28 (eleven years ago) link

I disagree, I think that to call it information - light = 'good', dark = 'bad', say - and not just mechanical stimulus response, requires a kind of awareness, a capacity to ascribe meaning, albeit at the simplest possible level imaginable. And yes I would be happy to ascribe that kind of awareness to that creature.

ledge, Friday, 13 July 2012 18:31 (eleven years ago) link

Ultimately information requires a conscious observer to ascribe meaning, otherwise it's all just mechanical patterns of dancing particles. Saying that your proto creature is not aware just pushes the problem higher up the chain - the problem of where this magical meaning-ascribing entity comes from.

ledge, Friday, 13 July 2012 18:36 (eleven years ago) link

if this amoeba can negotiate a maze to get to food, i'd say it was aware in a minimal but meaningful sense. it had to have constructed an internal mental model of the maze. (i.e. not just stimulus response)

Philip Nunez, Friday, 13 July 2012 18:40 (eleven years ago) link

well, we could say that the creature is or isn't "aware" in a conscious sense. you seem to be sticking the smallest unit of awareness to a primitive emotion or desire ("light = 'good', dark = 'bad', say"), which seems reasonable, if not the only reasonable way we might break it down.

my point was that viewed from outside, the creature is gathering "information" about its environment whether or not it is aware. i do not think that information gathering of this sort requires awareness. it only requires an environment, a mechanism by which some aspect of that environment can be measured, a biological "goal", and a responsive action that is environment-dependent and seeks to satisfy the goal. whether or not the creature is aware, its relationship to its environment and its biological goal remains the same, so i think it's appropriate to use the phrase "information gathering" in either case.

i guess i'm using a three-tiered system to talk about cognition: non-cognition, non-conscious information processing, and conscious information processing. rocks seem to be non-cognitive, simple organisms and computers seem to be non-conscious information processing systems (though we can't say for sure what is or isn't conscious in some way), and complex organisms process information in a conscious fashion. you're using a two-tiered system in which there's only non-cognition and conscious cognition, and information only belongs to the latter.

contenderizer, Friday, 13 July 2012 18:47 (eleven years ago) link

The intermediate step toward more complex consciousness is the development of a memory mechanism. That's where computers reside.

Aimless, Friday, 13 July 2012 18:50 (eleven years ago) link

i'd argue that's the final step.

Philip Nunez, Friday, 13 July 2012 18:55 (eleven years ago) link

if this amoeba can negotiate a maze to get to food, i'd say it was aware in a minimal but meaningful sense. it had to have constructed an internal mental model of the maze. (i.e. not just stimulus response)

yeah, but we can make computers that are capable of learning mazes, computers that operate on strictly stimulus-response terms. hell, human beings might well be stimulus-response machines. the ability to construct mental models is not necessarily the same as awareness. awareness is the experience of being, the experience of things "seeming" a certain way. it's not what you know, but how it feels to know it, how it seems to know it. the having of a self, whatever that means.

contenderizer, Friday, 13 July 2012 18:55 (eleven years ago) link

if some things seem a certain way, it's because that's how it is encoded in your memory. you can't build a computer program to solve and remember a maze without a memory mechanism.
how does a maze seem to a computer? probably boringly literal.
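
(toy illustration, invented python and an invented maze: take away the "visited" set and the stored paths and the program can neither solve the maze nor remember anything about it.)

# toy maze solver: the whole job is done by remembered state.
# the maze, start and goal are invented for illustration.
from collections import deque

MAZE = [            # 0 = open, 1 = wall
    [0, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

def solve(maze, start=(0, 0), goal=(3, 3)):
    # breadth-first search; returns the remembered route or None
    rows, cols = len(maze), len(maze[0])
    queue = deque([(start, [start])])   # each entry carries its remembered path
    visited = {start}                   # the memory mechanism: places already seen
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and maze[nr][nc] == 0 and (nr, nc) not in visited:
                visited.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None

print(solve(MAZE))  # how the maze "seems" to the program: a literal list of cells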

Philip Nunez, Friday, 13 July 2012 19:00 (eleven years ago) link

I have watched a bumblebee return to its nest when some details of the detritus near the entrance have been altered. It flies around the spot for a while gathering clues, comparing them to its old mental map of the area and learning the new configuration. Change things too much and it can't find its nest any more.

As an amateur AI computer programmer I can tell you that the most critical missing element in most AI computer programs is an equivalent for emotions. If a computer had feelings about mazes, it would not find them boringly literal.

Aimless, Friday, 13 July 2012 19:09 (eleven years ago) link

both echo ancestors so primitive we could hardly recognize the one in the other

see that's what i'm talking about when i'm talking about the consciousness of a stone

the late great, Friday, 13 July 2012 19:12 (eleven years ago) link

But mental events are not identical with or reducible to physical ones,

this is where you and i disagree. "mental events" emerge from a plenitude of neurons interacting with one another, in a variety of processes that overlap and parallel one another. not to mention all the information and direction that goes back and forth from other parts of the body. the tiny electrical charges and other physical events are the stuff of consciousness (insofar as consciousness emerges from a certain density of such activity).

flesh, the devil, and a wolf (wolf) (amateurist), Friday, 13 July 2012 19:12 (eleven years ago) link

just because we could never (?) possibly map out the complex of neural activity in the brain and correlate it with any exactitude to mental events doesn't mean that mental events don't have a physical basis.

flesh, the devil, and a wolf (wolf) (amateurist), Friday, 13 July 2012 19:15 (eleven years ago) link

can i prove this? not exactly. although experimental neuroscience can demonstrate stuff like which neural networks correlate to certain aspects of cognition (and hence experience).

but i'm not sure it needs to be "proved" in the impossible sense you seem to call for in order to be convinced.

flesh, the devil, and a wolf (wolf) (amateurist), Friday, 13 July 2012 19:18 (eleven years ago) link

there's an argument for consciousness, intelligence, emotions etc... being an emergent property of sufficient complexity but i dunno. I think it only appears that way because we describe these things in such mushy terms, so of course you need a fair amount of mush and slop to cover all the bases. maybe to meet a properly bounded, minimal definition of consciousness you would only need 3 dice and some twine.

Philip Nunez, Friday, 13 July 2012 19:19 (eleven years ago) link

well sure it's a hypothesis. but even if consciousness were a product of a less complex series of neural interactions it doesn't change the likely fact that every "thought" is an effect (not a "consequence," which would seem to presume some _other_ thing "processing" the interactions) of such activity.

flesh, the devil, and a wolf (wolf) (amateurist), Friday, 13 July 2012 19:23 (eleven years ago) link

Saying that your proto creature is not aware just pushes the problem higher up the chain - the problem of where this magical meaning-ascribing entity comes from.

― ledge, Friday, July 13, 2012 11:36 AM (18 minutes ago)

i think the problem is that we want to treat the self-possessing, seeming-perceiving consciousness as a magical kind of thing. given how little we know about it, it's hard not to do this, but i'm more inclined to treat it as nonmagical. i suspect that the conscious mind is an evolved product of the need to manage many overlapping layers and patterns of non-conscious information processing. for instance: a simple creature has a damage sensor and an aversion mechanism. a more complex creature has a subroutine dedicated to "sensing" different sorts of pain and selecting an appropriate response. an even more complex creature needs networks of networks that can coordinate an impossibly vast array of information types, the complexities within complexities branching out towards infinity. at this level of complexity, it makes sense to dumb things back down by subordinating all that infernal cognitive complexity to a decider whose only job is to simply feel the general tenor of the whole and just say "ow" when necessary.

a theory: i suspect this is why evolution has placed a final, conscious arbiter at the top of the cognitive decision tree. the arbiter's job is to make moment-to-moment simple sense of deep processes about which it knows very little. rather than deal with the fantastically complex "machine language" of the body, the arbiter instead can simply coordinate the informed decisions made by countless neurological and cognitive subsystems. it has access to libraries of stored memory and highly flexible symbolic coding systems which it can use to measure the present set of environmental circumstances and internal urgings against others experienced in the past. its job is to know things about the self and to make sense of information provided by the brain and body. its job is, in a sense, to be aware. is it any surprise, then, that it actually is aware? that it does "know things" and "feel things"?

this is us. it's what we are. there's nothing magical or non-material about it. we're the workings of the top-level computer whose job it is to know things and make decisions about the rest of the system.
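
(a toy sketch of the shape i mean, in python. every subsystem, name and number is invented for illustration; the only point is that the arbiter never touches the raw signals, only the subsystems' summaries.)

# toy sketch of the arbiter idea: subsystems digest the raw "machine language"
# into simple verdicts, and the top-level arbiter only ever sees those.

def pain_subsystem(nociceptor_readings):
    # crunches raw damage-sensor values into one summary level
    return max(nociceptor_readings, default=0.0)

def hunger_subsystem(blood_sugar):
    # turns an internal measurement into a simple urge strength
    return max(0.0, 1.0 - blood_sugar)

def arbiter(summaries):
    # the "decider": feels the general tenor of the whole and says ow when necessary
    if summaries["pain"] > 0.7:
        return "ow"
    if summaries["hunger"] > 0.5:
        return "go find food"
    return "carry on"

# one moment, as the arbiter sees it
raw_nociceptors = [0.1, 0.9, 0.2]        # something just got burned
summaries = {
    "pain": pain_subsystem(raw_nociceptors),
    "hunger": hunger_subsystem(blood_sugar=0.8),
}
print(arbiter(summaries))   # -> "ow": one simple felt verdict over vast hidden detail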

contenderizer, Friday, 13 July 2012 19:24 (eleven years ago) link

that is beautifully explained, contenderizer. much clearer than a host of books i've read recently.

flesh, the devil, and a wolf (wolf) (amateurist), Friday, 13 July 2012 19:27 (eleven years ago) link

of course the question of what we _do_ with such knowledge--whether it is of any real consequence in "coming to terms" with our mortality--is still open....

flesh, the devil, and a wolf (wolf) (amateurist), Friday, 13 July 2012 19:28 (eleven years ago) link

definitely makes blade runner even more resonant than it once was, that's for sure...

flesh, the devil, and a wolf (wolf) (amateurist), Friday, 13 July 2012 19:30 (eleven years ago) link

i think that's very poorly explained tbh

the late great, Friday, 13 July 2012 19:33 (eleven years ago) link

it's a big hand-wave-y analogy substituting familiar material things (damage sensors, computers) for unfathomable things you're not grappling with

i mean you could just substitute "a tiny little green man sitting in a cockpit in my head" for "a computer" and the construction of that paragraph would be as logically sound, except we "know" that computers exist and elves don't, which makes the argument seem very reasonable and reassuring

the late great, Friday, 13 July 2012 19:37 (eleven years ago) link

thanks, amateurist. to put it way more simply...

in evolutionary terms, "top down" holistic awareness is a development that allows for overall decision-making to remain relatively efficient while information gathering, storage and coordination systems proliferate in complexity. of course i can't prove this, but it makes sense and seems likely to me. it's why i'm surprised by the argument that consciousness is inexplicable and perhaps even trans-physical.

of course, that's a "why is consciousness?" argument, and not a "what is consciousness?" one, but i think the two questions are probably related. of course, it's possible that consciousness arises not due to evolutionary pressure, but by other means and/or for other reasons. or maybe god made it, i dunno...

contenderizer, Friday, 13 July 2012 19:39 (eleven years ago) link

the tiny electrical charges and other physical events are the stuff of consciousness (insofar as consciousness emerges from a certain density of such activity)

To me this is just like saying "plant enough apple trees and you're sure to get an orange".

ledge, Friday, 13 July 2012 19:39 (eleven years ago) link

timeout -- do you guys consider pre/non-linguistic thought as conscious or un/sub-conscious?

Philip Nunez, Friday, 13 July 2012 19:41 (eleven years ago) link

it's why i'm surprised by the argument that consciousness is inexplicable and perhaps even trans-physical

i'm not arguing for that and i don't think anybody's offered any evidence for it either! i would call it more of an intuition?

i think your computer / machine language / information processing stuff is straight medieval argument from analogy

the late great, Friday, 13 July 2012 19:42 (eleven years ago) link

great question philip

the late great, Friday, 13 July 2012 19:43 (eleven years ago) link

i mean you could just substitute "a tiny little green man sitting in a cockpit in my head" for "a computer" and the construction of that paragraph would be as logically sound, except we "know" that computers exist and elves don't, which makes the argument seem very reasonable and reassuring

well, i think it's a bit more substantive than that. i'm really just talking about systems dedicated to the collection and processing of information - systems for which computers are a good metaphor, but which have existed in biology for a lot longer than computers, people or even (probably, according to me) awareness.

it may well be that the handy model of computer-type programming and data processing is misleading, that it distorts our sense of how biological cognition and consciousness really work, but i don't see much evidence of that at present. therefore, i'm inclined to use the model until it proves decisively unfit.

contenderizer, Friday, 13 July 2012 19:45 (eleven years ago) link

The argument from analogy is just as strong or as weak as the resemblance between the things analogized. The fact that it was employed by medieval thinkers is unsurprising. Everyone uses it.

Aimless, Friday, 13 July 2012 19:45 (eleven years ago) link

timeout -- do you guys consider pre/non-linguistic thought as conscious or un/sub-conscious?

I think much more thought is non linguistic than is commonly supposed. Perhaps most of it. So it can be conscious, no problemo.

ledge, Friday, 13 July 2012 19:45 (eleven years ago) link

i don't think it's a medieval theory but there's a lot of evidence to show that's not the way it works. for example, the reflexes we have bypass the round trip of executive pain decision-making, and rightfully so, or we'd be burning ourselves on stovetops for longer than necessary.

re: if the non-linguistic thought is conscious, wouldn't we be able to notice it?

Philip Nunez, Friday, 13 July 2012 19:48 (eleven years ago) link

timeout -- do you guys consider pre/non-linguistic thought as conscious or un/sub-conscious?

― Philip Nunez, Friday, July 13, 2012 2:41 PM (3 minutes ago)

i don't suspect language (in the expansive sense) is a prerequisite for consciousness. i imagine there are forms of reasoning and (self-)representation characteristic of animals with much less complex neural systems that probably grant them something like conscious experience.

xpost what ledge says. keep in mind that some humans lack the capacity for language but still exhibit behaviors that suggest conscious self-awareness.

and there's nothing wrong with argument by analogy. in fact it's arguably a basic component of animal reasoning!

flesh, the devil, and a wolf (wolf) (amateurist), Friday, 13 July 2012 19:49 (eleven years ago) link

here's a neat trick, try to de-linguify some aspect of your awareness, and i think you'll find it drops into the unconscious realm, and suddenly you'll have lost 5 minutes.

Philip Nunez, Friday, 13 July 2012 19:51 (eleven years ago) link

if the non-linguistic thought is conscious, wouldn't we be able to notice it?

Dogs have no language in the sense you are using the term. They give every appearance of being conscious.

Aimless, Friday, 13 July 2012 19:51 (eleven years ago) link

there's nothing wrong with argument by analogy?!?

the late great, Friday, 13 July 2012 19:52 (eleven years ago) link

why wouldn't dogs have language?

Philip Nunez, Friday, 13 July 2012 19:53 (eleven years ago) link

I guess I was wrong about how you were using the term.

Aimless, Friday, 13 July 2012 19:54 (eleven years ago) link

plants which are not watered wither and die like unfed men and those that are watered tend to grow plump and full. likewise, cutting the roots of a tree will cause it to wither. therefore we can conclude the sustenance of the plant is from water and soil and it is from water and soil that it gains the raw materials of growth.

the late great, Friday, 13 July 2012 19:56 (eleven years ago) link

that's argument by analogy

the late great, Friday, 13 July 2012 19:56 (eleven years ago) link

