August 10, 2005

Functionalism at Forty (PDF) A very engaging and accessible Reductionist critique of Functionalism as it enters its fifth decade. "What is Functionalism?" you ask? Well...
  • What is the ultimate nature of the mental? ummm. . . teh funny? heh. sorry - i'll be quiet now
  • For the other side, there's What to say to a skeptical metaphysician. Unfortunately, I can't find a free-access site for it.
  • The essay reminds me of a view put forth in a book called "Chemobiodynamics and Drug Design" by F. W. Schueler (1960). As I recall it, he said that information, mass and energy are equally universal. That is, everything is information. Even two atoms colliding exchange information, but more comes (exponentially?) as we study higher levels of organization: cells, organs, organisms and societies (the topmost example given). This kind of informational hierarchy almost resembles the medieval 'ladder of being' - except that angels were included on that one. So by way of functionalism we again enter the tower of metaphysics, even as we sit like monkeys and think.
  • In the 'Well' link, I've gotten up to the table, and it baffles me. I think it's just wrong. This is what they want:

          S1      S2
      1   odd     even
          ->S2    ->S1
      0   even    odd
          ->S1    ->S2

    And you'd start in S1. (Get a 1, output is odd, etc.) Using the one they have, consider the following sequence, starting in S1:

      In: 0   Out: Odd!    Goto: S1
      In: 1   Out: Odd     Goto: S2
      In: 0   Out: Even!   Goto: S2
      In: 1   Out: Even    Goto: S1

    In short, whenever you have a zero input, you get the wrong output. The kicker is, someone noticed that there was a problem -- "The machine is intended to start in S1, so if its first input is a ‘0’, it will wrongly say that it has seen an odd number of ‘1’s, but once it has seen a one, subsequent answers will be correct. (The flaw is corrected in the next machine.)" -- but just couldn't figure out why or how to fix it (subsequent answers will not be correct; the whole thing is borked). They're correct in noting that the next one fixes the problem. It's not very complicated. To go on a bit of a rant: this is what every natural scientist secretly (or not) suspects while listening to philosophers: lots of fancy talk without ever understanding what the hell they're talking about.
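  A minimal sketch of the corrected parity machine described above, for anyone who wants to run it. It follows the commenter's proposed table (start in S1; a 1 reports the new parity and flips the state, a 0 reports the current parity and stays put), not the table in the linked paper; the Python below and its state labels are illustrative only.

      # Transition table for the commenter's corrected odd/even machine.
      # (state, input) -> (output, next state); S1 = even number of 1s seen so far, S2 = odd.
      TABLE = {
          ("S1", 1): ("odd",  "S2"),
          ("S1", 0): ("even", "S1"),
          ("S2", 1): ("even", "S1"),
          ("S2", 0): ("odd",  "S2"),
      }

      def run(inputs, start="S1"):
          """Feed a sequence of 0/1 inputs to the machine and collect its answers."""
          state, outputs = start, []
          for bit in inputs:
              out, state = TABLE[(state, bit)]
              outputs.append(out)
          return outputs

      if __name__ == "__main__":
          # The commenter's test sequence: the answers track the running parity of 1s.
          print(run([0, 1, 0, 1]))   # ['even', 'odd', 'odd', 'even']

  On the same inputs (0, 1, 0, 1) the machine quoted from the paper would answer odd, odd, even, even, which is the mismatch on zero inputs that the comment complains about.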
  • exppii - Yeah, that's the way I felt when I took grad-level syntax. petebest - When you paired mental and funny, I immediately thought of Ed Grimley doing the mental dance.
  • ...this is what every natural scientist secretly (or not) suspects while listening to philosophers: lots of fancy talk without ever understanding what the hell they're talking about. I was sympathetic to this view before I became a student of philosophy. And, of course, I'm still a bit skeptical when any one of them (me included) puts forth a thesis peppered, perhaps unnecessarily, with semi-technical verbiage. But I've come to see how philosophers view natural scientists when those scientists start putting forth philosophical views. The natural scientists seem, on balance, to have less facility with cogent statements of philosophical theses, less ability to appreciate the commitments incurred by their positions, and less flexibility in responding to criticisms of their views. In short, it seems like many of the natural scientists who put forth philosophical views do so ham-fistedly, and assume, just because they can't appreciate the subtlety of the dialectic, that their (philosopher) interlocutors have no understanding of "what the hell they're talking about."
  • On the other hand, I'm amazed at how much verbiage this guy needed to put forward his ideas. There really isn't much "there" there. Also it will be natural scientist in the end who is going to find the answer to this question about the nature of mind, if the answer is obtainable.
  • ..it will be the natural scientist in the end... grrrrr
  • Also it will be natural scientist in the end who is going to find the answer to this question It's unclear whether the natural sciences are even suited to pose the question to which you claim they will find the answer.
  • Wash Jones - I remember once seeing a quote saying something to the effect of, "The mind is what the brain does." Speaking personally and as a non-specialist, I think we're nearing the point -- say, a decade or two away -- when the question "What is the ultimate nature of the mental?" will be clearly answered by cognitive neuroscientists.
  • I agree with you, cog_nate, but perhaps am a bit more pessimistic. I'd guess three or four decades. It's unclear whether the natural sciences are even suited to pose the question Hmmm, I would fall back on another example. I.e., what is the ultimate nature of matter? That's a question that befuddled a lot of early philosophers and produced mounds of heavy verbiage. But now it is squarely in the ken of natural scientists. It's a tough struggle, but the harvest from science is truer than the valiant efforts of philosophers.
  • I anticipate the advances of cognitive science just as much as the next guy. I really do think it's our best hope at gaining insight into mental phenomena. No disagreement there. The reason I shy away from thinking it will reveal the "ultimate nature of the mental" is that cognitive neuroscience, as a hypothetico-deductive scheme for investigating physical phenomena, must remain silent about non-physical phenomena, if there are any. It seems possible, however unlikely, that consciousness is not a physical phenomenon, but rather an epiphenomenon of purely physical brain processes. Of course, we're likely on the same page regarding this; I think Epiphenomenalism is completely nutty. But until there's a good argument for why it's wrong, I don't think we can dismiss it. Until we can dismiss theses like this, I don't think we can say that we understand the ultimate nature of the mental. I'm not saying that the results of science aren't more satisfying than the "results" of philosophy. Of course they're more satisfying. But it seems to me that before we can begin a rigorous science program, we must have agreed upon at least a few philosophical views -- such as that the only things are physical things. Any arguments for those views can't be made from within a materialistic scientific framework.
  • I think we may agree broadly except for this point: before we can begin a rigorous science program, we must have agreed upon at least a few philosophical views Why would I need any philosophical view at all? Unless by view you mean a hypothesis as part of the scientific method? Even then, I'm not even sure we need much hypothesizing to start. Brain scans on people in various mental states have already started to open the mysterious door just a tiny crack. Ditto for left/right brain research. I guess what I'm saying is that lots of pure investigative science can be done without any need for prior philosophical views.
  • Can we even think such a thought without some kind of cognitive framework? Philosophical frameworks may just be more specific than we would like is all.
  • I think Epiphenomenalism is completely nutty. How so?
  • Epiphenomenalism does sound like it puts the onus on the brain as the driving wheel, with the mind a sort of mental hubcap. What's wrong with this picture is that the hubcap may just be how we experience the axle - that part of the brain that lies beneath the event horizon. In zen I think they call that 'no-mind,' a sort of empty potential which has a greater effect when it finally turns.
  • How so?
    Perhaps I shouldn't have said completely nutty, but nuttier than a physicalist / functionalist view of the mental. I'll take David Chalmers and Frank Jackson (property dualists who have flirted with Epiphenomenalism) as specific targets. Chalmers's conceivability arguments are his main support for Epiphenomenalism, and these arguments rest on his development of the 2-D semantic framework, which I think is artificial and misguided. In my more cynical moments, it seems like something he's cooked up to get the philosophy of mind results he wants. As far as Jackson goes, I take Katalin Balog's "Conceivability, Possibility, and the Mind-Body Problem" (Philosophical Review, vol. 108, no. 4, Oct. 1999) to present a good argument against the possibility of a fully intentional being which lacks any phenomenal consciousness (a "zombie"), the existence of which is supposed to lead one to the Epiphenomenalist position. In sum, I think a more convincing semantic theory would be less likely to tempt one to the Epiphenomenalist position. And once we had such a theory, I believe zombies of the type Chalmers thinks are possible would seem much less plausible. BTW - I found the article Gyan was seeking and put it on the web, along with the Balog that I just mentioned.
  • The prize goes to Exppi: MonkeyFilter: Lots of fancy talk without ever understanding what the hell they're talking about. What is the ultimate nature of the mental? It's all chemistry.
  • Electro-colloidal chemistry.
  • My brother went to university with David Chalmers. He (my bro, that is) basically missed out on a Rhodes because that year it was Chalmers first, daylight second in the mathematics department. Apparently he was a completely awesome mathematician, about the best they've ever had in this admittedly hick town. The brother has stayed at DC's place in the States and all that stuff, and I am told they still talk shit on the email thingy.
  • /pointless namedropping
  • I've been to Adelaide, stayed at the Austral Hotel, interviewed and filmed Dr. Richard Jenkins there on the Precambrian fossils of the Ediacara Hills. Adelaide is not a hick town to my mind, even if my 'mind' is a merely physical sidedish (apologies to mofi member), and if David Chalmers (new to me, but just googled) cares about your brother, I thank you for a deeply informative post.
  • I take ... Balog ... to present a good argument against the possibility of a fully intentional being which lacks any phenomenal consciousness Well, there's your problem, I don't see how a zombie would have intentional states.
  • I really don't either, but Chalmers has been setting the pace with regard to this issue since his book The Conscious Mind came out in 1996, and he thinks that zombies are conceivable and so metaphysically possible.
  • What I meant was that Balog's rebuttal doesn't work because zombies won't have intentional states. From the paper: Assumption 1: Jackson and zombie-Jackson share most of their intentional states except those involving phenomenal states ... On Assumption 1, Jackson and zombie-Jackson mean the same by their words, except where phenomenal terms are involved. ... I would like now to consider some objections. First, Assumption 1: One might object to it that zombies do not have intentional states at all. ... The most prominent exposition of this view is due to Searle; ... Contrary to Searle, a good case can be made that zombie-Jackson does have intentional states. Zombie-Jackson communicates with his colleagues: he answers questions, his utterances convey information, his actions are made intelligible by the assumption that he has beliefs and desires, etc. Is this defense a joke? The vocal cords of a zombie throwing out air don't mean anything to a zombie. The 'sound', and ultimately the semantics, is only discernible to a sentient agent. If Balog is objecting that zombie interactions can't turn out coherent, then there's a counter to that, i.e. she's presupposing that humans have free will, which is necessary for meaningful interactions to emerge. The other objection to Assumption 2 was that even if the zombie has intentional states in general, his term 'pain' in fact does not refer to anything. In my view, this is wrong. This position has the counterintuitive consequence that all of the zombie's phenomenal talk lacks truth value. This would be a very uncharitable interpretation of zombies: it would imply that zombies are massively deluded about their mental life. Zombies can't be deluded, since they don't have mental states. Then she goes on to talk about "partial zombies", which is an oxymoron. Finally, she attempts to bypass the intentionality barrier by talking of special humans called yogis who can detect brain states without a phenomenal feel (??!).
  • Oh, here's Chalmers's reply to Balog.
  • Are you guys serious? Is time being spent in academia on the inner life of zombies? That's a laugh riot! Duuuudes, put down yer pens and get thee to a laboratory!
  • On your earlier point, StoryBored, you do need some prior philosophical assumptions before undertaking scientific research. You have to assume that there is a real external world to investigate, for one thing. You have to assume that your senses are broadly telling you the truth about it. And so on. I sympathise with your point of view, Gyan, but I don't quite see what your argument is. Why does intentionality require phenomenal experience? The best I can come up with is that if a zombie told you his foot was hurting, you might think the statement was in some sense false - because ex hypothesi, he doesn't have subjective experiences, such as pain. I'm not sure I see what compels us to regard his words as meaningless (to him, I mean; obviously they could still have derived intentionality from our point of view).
  • I'm not sure I see what compels us to regard his words as meaningless (to him, I mean Let me clarify: you are asking why should the words be meaningless to the zombie? or to us? If the latter, of course it's meaningful. If the former, then it's meaningless because there's no self. So, "my foot is hurting" doesn't parse. More importantly, a zombie is just a physical machine (albeit a complicated one that looks and acts like conscious humans). It makes as much sense to assign intentionality to a zombie, as to a robotic arm designed to pick up a ball. If we do so, it's just the result of our 'Theory of Mind' modelling, where we model isomorphic behaviour by other agents, in terms of our self-modelling. An intention or belief is a mental disposition held by a phenomenal self. Zombies don't possess selves.
  • Hm - would it be meaningless, or merely false? If it means something to us (a trivial point in itself, I agree), why shouldn't it have the same meaning for the zombie? And why can't zombies have selves? I think you're probably quite right as a matter of fact (I think philosophical zombies are impossible, myself) but I think you're begging the question a bit. Alright, a zombie is just a physical machine - but to a functionalist, a human being is just a physical machine too. If human machines have intentionality, why not non-human machines? If you put me on the spot, I think my argument would have to be that it has something to do with the fact that machines are artefacts, and humans aren't. But I couldn't pretend to have a real knock-down case.
  • On your earlier point, StoryBored, you do need some prior philosophical assumptions before undertaking scientific research. You have to assume that there is a real external world to investigate, for one thing. You have to assume that your senses are broadly telling you the truth about it. Yes, that's true, strictly speaking. But I would say that most natural scientists would take them for granted and waste no time with it. The assumptions I was thinking more of were the handwavey and ivory-tower stuff (e.g. like the zombie thang). What's needed is more troops on the ground, not more theorizing. Which is to say (without any proof at all) that the nature of mind is a scientific question, not a philosophical one.
  • If it means something to us (a trivial point in itself, I agree), why shouldn't it have the same meaning for the zombie? And why can't zombies have selves? What do you think a 'self' is? I think philosophical zombies are impossible, myself They are possible. I was just arguing that Balog's rebuttal doesn't work. a human being is just a physical machine too. No, and that's the whole point of the zombie argument. That consciousness isn't physical.
  • Since you ask, I think a self, or a person, is an origin (a generator, if you like) of original intentions. There are certain processes to do with the recognition of future and distant contingencies which give rise to new intentions, and wherever they occur, there is a self. They're not, in my view, functionalist-style computation processes, however. In the unlikely event of your wanting a longer exposition (I think you'd just find it tiresome), you could look at the piece on 'personhood' on my own site. I think philosophical zombies are impossible because of the radical disconnection between thought and behaviour they imply. Zombie Plegmund does and says exactly the same things as me, even when he's talking about his subjective experiences, though in fact he doesn't have any subjective experiences. This implies my inner, subjective life is irrelevant to what I say and do, which to me is highly implausible. Since I think zombies don't exist, it follows that I don't think they have intentionality. But (to reprise my original inquiry one last time) since you don't seem to share that view, what is it that makes you so sure they can't have intentionality?
  • Since you ask, I think a self, or a person, is an origin (a generator, if you like) of original intentions. What's an intention? This implies my inner, subjective life is irrelevant to what I say and do, which to me is highly implausible. So, you are committed to free will, and functionalism. How do you reconcile free will and physicalism? Since I think zombies don't exist, it follows that I don't think they have intentionality. No, it doesn't. Whether they exist or not, has nothing to do with whether they would possess intentionality if they did exist.
  • I refer the honourable gentleman to the source I mentioned earlier: my views on all these interesting points are on my own site. It certainly seems to me that things that don't exist don't have intentionality (not original intentionality, that is. Derived intentionality is another matter - I suppose imaginary books could still be about something. The version of Hamlet which Shakespeare gave a happy ending does not exist, but it would still seem to be about Hamlet in some sense.). Whether they would have it if they did exist is surely a different question, and one on which I can offer no conclusive arguments. If the moon were cheese, would it be Stilton or Wensleydale? The impression that's coming across to me is that your views about the intentionality of zombies are actually based purely on intuition. Nothing wrong with intuition; I'm just surprised you can feel so certain about the answer on that basis alone. Perhaps you have religious and/or mystical grounds for your convictions. (That isn't meant to sound sarcastic and I'm sorry if my persistence is annoying.)
  • Enter the linguistic laboratory. Zombies have no will of their own, right? So they are not subjects. But they might yet be characterized as adverbs (walking, say, in a zombie like fashion). Or they might yet be 'extended adjectives' like a walking collection of qualities (glazed eyes, bad breath, etc.). The point is that Functionalism would have it that we're all zombies, not subjects at all.
  • It certainly seems to me that things that don't exist don't have intentionality Things that don't exist, aren't, but that has no general bearing on other attributes. Would you say that a unicorn has no shape? my views on all these interesting points are on my own site. Will all due respect, I'm not doing to dig around for bits & pieces, here & there. You're much more familiar with that corpus of writings than I am. Couldn't you just copy-paste the relevant writing? To come back to matters about the self, suffice to note that a self distinguishes between the self and not-self. What, within the zombie, would perform this function?
  • Things that don't exist, aren't, but that has no general bearing on other attributes. Would you say that a unicorn has no shape? It seems to me that Plegmund is arguing that zombies not only do not exist, but that their existence is logically impossible. The concept is incoherent so it makes no sense to talk about what a zombie would be like if it existed. I think.
  • The concept is incoherent so it makes no sense to talk about what a zombie would be like if it existed. I think. Of course it would. How else do you know they are logically impossible?
  • Here's how I see it. Take the statement, "I believe such-and-such with this-and-that characteristics, which for the purposes of this discussion we'll call a 'zombie,' would not have intentionality." I think Plegmund is arguing that the basic concept of a "zombie" is incoherent; talking about whether an incoherent concept could have intentionality is putting the cart before the horse. If I understand him properly.
  • I think Plegmund is arguing that the basic concept of a "zombie" is incoherent; talking about whether an incoherent concept could have intentionality is putting the cart before the horse That depends on whether he meant "can't" when he wrote "don't".
  • Bah! Incoherency puts no restrictions on zombies. Alright, ignore that too.
  • Actually it gets me off many hooks. Thanks for letting me listen in.
  • Thanks, Smo. I wouldn't actually go quite so far as to say that the idea of zombies is logically incoherent - but that's not far off. I think the idea of zombies, people who have all the normal mental functions but none of the subjective experience, is about on a par with the idea of Qzma, a substance which is H2O but not water. That contradicts the laws of physics rather than those of logic, but it still makes it pretty hard to have any well-founded views about whether Qzma would be wet if it existed - and I have a similar difficulty in deciding whether zombies would have intentionality.
  • I think the idea of zombies, people who have all the normal mental functions but none of the subjective experience Incorrect. Zombies have no mental functions, as well. Their external behaviour mirrors ours.
  • 'Incorrect?' If you can't manage civility, Gyan, you can fuck off. What do you think zombie Plegmund's encephalogram would look like? The same as mine.
  • 'Incorrect?' If you can't manage civility, Gyan, you can fuck off. Where the fuck was I uncivil? What do you think zombie Plegmund's encephalogram would look like? The same as mine. How does that make any difference? Electrical activity is not internal behaviour, it's activity beneath the skin. Motion of the CSF, or motion of serotonin across the synapse, is no different in essence than my hand moving about.
  • To make it less ambiguous, replace 'external' with 'empirical'.
  • The worst of it is that you guys seem to think zombies don't exist.
  • Grump.
  • It seems we're talking about different kinds of zombie (quite apart from the Wolofian variety). Zombie Chalmers "is molecule for molecule identical to me, and identical in all the low-level properties postulated by a completed physics, but he lacks conscious experience entirely... He will be psychologically identical to me, in the sense developed in Chapter 1... He will be perceiving the trees outside, in the functional sense, and tasting the chocolate, in the psychological sense..." That's what I mean - if you think the word 'mental' begs some of the questions, 'brain' is OK with me. There are two zombie Dennetts, on the other hand. The basic model is "a human being who exhibits perfectly natural, alert, loquacious, vivacious behaviour, but is in fact not conscious at all, but some kind of automaton... you can't tell a zombie from a normal person by examining external behaviour..." I think that's what you have in mind. At the top of the Dennett range is a super-zombie, or zimbo, "a zombie that, as a result of self-monitoring, has internal (but unconscious) higher order informational states that are about its other, lower order informational states... at the very least, the zimbo would (unconsciously) believe that it was in various mental states... It would think it was conscious, even if it wasn't!" And now I must leave this interesting discussion, not in any form of huff or dudgeon, but because I'm going to be off-line altogether for a while.
  • Yup, then I mean the basic zombie-Dennett.
  • Well. Would these message puppets, only speaking like one of us, qualify as zombies, then?