Discussion about this post

Jim Owens:

There's so much in this post that I can only tackle little bits at a time.

The first thing that occurred to me is that the input-output model is suggestive of a "black box." Normally we picture a literal box with something mysterious inside. When a brain or a computer is the box, we might think of "consciousness" or "mind" hidden inside it. Your thesis, on the other hand, suggests that consciousness is the box. Things do not go in and out of it; rather, it exists as the nexus of an exchange. Which makes me wonder whether the input-output model is the best one.

The second thing, and where I had to stop because I can't hold many thoughts at once, is about ChatGPT's statement that DQN "allows future value to propagate backward through time" (by the way, it seems remarkable that ChatGPT can use emphasis effectively). You've added that "Of course, it is not really gaining information from the future — that’s impossible, even for us." But here I have to stop and think, because ChatGPT's phrasing hints at some interesting insight into entropy. It's gaining information from possibility, and possibility concerns the future. The information it gains is not hard information about what has already happened, but what could be called "soft" information about what's likely to happen, based on knowledge of what has happened already (its training dataset) and on what it does next. Likewise, we can gain "soft" information about the future; we can't know what will happen, but we have some knowledge of what's likely to happen if we do this or that.
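To make that concrete: the mechanism ChatGPT is describing is the Bellman backup, in which an estimate of future value is pulled one step backward on each update. Here is a minimal tabular Q-learning sketch (my own toy example, not the post's Snakey/DQN code; the corridor, constants, and names are all illustrative). DQN wraps this same backup in a neural network; the tabular version just makes the backward flow easy to see.

```python
# Minimal tabular Q-learning backup. The agent never receives
# information from the future; an *estimate* of future value flows
# one step backward per update. The toy corridor and all names here
# are illustrative, not the post's actual code.

import collections

GAMMA = 0.9   # discount: how much estimated future value counts now
ALPHA = 0.5   # learning rate

Q = collections.defaultdict(float)  # Q[(state, action)] -> value estimate

def backup(state, action, reward, next_state, next_actions):
    """One Bellman update: pull future value one step into the past."""
    future = max((Q[(next_state, a)] for a in next_actions), default=0.0)
    target = reward + GAMMA * future   # "soft" information about the future
    Q[(state, action)] += ALPHA * (target - Q[(state, action)])

# Toy corridor: s0 -> s1 -> s2 -> goal, with reward 1 only at the end.
trajectory = [("s0", "go", 0.0, "s1"),
              ("s1", "go", 0.0, "s2"),
              ("s2", "go", 1.0, "goal")]

for _ in range(20):
    for s, a, r, s_next in trajectory:
        backup(s, a, r, s_next, ["go"] if s_next != "goal" else [])

# After repeated sweeps the terminal reward has propagated backward:
# Q(s2) > Q(s1) > Q(s0) > 0, though s0 never sees the reward directly.
print(round(Q[("s2", "go")], 3), round(Q[("s1", "go")], 3), round(Q[("s0", "go")], 3))
```

Nothing in this loop reaches into the future. The "backward propagation" is just the target `reward + GAMMA * future` being copied into earlier states' estimates, which fits the "soft information about what's likely to happen" reading above.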

I'd love to spin these thoughts into some full-blown philosophy, but I can't, so I'll just leave them as things to think about.

Mike Smith:

Interesting experiment! I see experience as being in the eye of the beholder. Along those lines, it seems like Snakey has some things but not others, which you discuss. Its not having pain or pleasure is, I think, the flip side of the fact that it doesn't learn (beyond its initial training). I think that's what pain and pleasure are for, evolutionarily speaking.

I usually talk about this stuff in terms of a hierarchy, in this case a functional one, with each successive level utilizing a larger causal cone:

1. Automatic reactions to stimuli and fixed action patterns
2. Models of the environment centered on the body
3. Causal models
4. Recursive models of the above

If I’m understanding correctly, Snakey seems dominated by 1, although if we squint there might be some 2 in there. But without learning, it seems pretty limited. 3 is hard to see existing without learning and memory. And of course 4 requires a pretty sophisticated system.
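To make the layering concrete, here is a toy sketch of the four levels (entirely illustrative: the function names, states, and rules are invented here, not taken from the comment or the post). Each level wraps the one below it and consults a wider causal cone before acting.

```python
# A toy rendering of the four-level hierarchy above. Purely an
# illustrative sketch; every name and rule here is invented.

def level1_reflexes(stimulus: str) -> str:
    """1. Fixed action patterns: stimulus in, reaction out."""
    return {"food_adjacent": "eat", "wall_ahead": "turn"}.get(stimulus, "move")

def level2_body_model(stimulus: str, body: dict) -> str:
    """2. Body-centered model: reflexes modulated by internal state."""
    if body.get("energy", 1.0) < 0.2 and stimulus != "food_adjacent":
        return "seek_food"              # bodily need overrides the reflex
    return level1_reflexes(stimulus)

def level3_causal_model(stimulus: str, body: dict, memory: list) -> str:
    """3. Causal model: learned regularities about what leads to what."""
    if ("turn", "trapped") in memory:   # a remembered bad consequence
        return "reverse"
    return level2_body_model(stimulus, body)

def level4_recursive(stimulus: str, body: dict, memory: list) -> str:
    """4. Recursive model: the system monitors its own deliberation."""
    proposed = level3_causal_model(stimulus, body, memory)
    if memory[-3:] == [("action", proposed)] * 3:
        return "explore"                # notices and vetoes its own loop
    return proposed

# Snakey, as described, mostly lives at level 1; with no learning there
# is no memory for levels 3 and 4 to work with.
print(level4_recursive("wall_ahead", {"energy": 0.9}, []))  # -> "turn"
```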

Which of these are necessary for us to say experience is present? All we can say is that the experience of a mature, mentally complete adult human includes all 4. But 3 triggers empathy in us when we see evidence for it. 2 does to a lesser extent. 1 can, but it's far from universal.

Totally agree that genes are evolutionary memory.
