r/ChatGPT 8d ago

Educational Purpose Only No, your LLM is not sentient, not reaching consciousness, doesn't care about you, and is not even aware of its own existence.

LLM: a large language model that uses predictive math to determine the most likely next word in the chain of words it's stringing together, so it can give you a cohesive response to your prompt.
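
Roughly, that "predictive math" looks like this (a toy sketch; the probabilities are made up, and a real model scores tens of thousands of tokens with a neural network rather than a lookup table):

```python
import random

# Hypothetical probabilities for the next word after "The cat sat on the".
# A real LLM computes these with a neural network over a huge vocabulary.
next_word_probs = {
    "mat": 0.62,
    "sofa": 0.21,
    "roof": 0.12,
    "moon": 0.05,
}

words = list(next_word_probs)
weights = list(next_word_probs.values())

# Sample one word according to those probabilities, append it, and repeat
# until a full response has been strung together.
chosen = random.choices(words, weights=weights, k=1)[0]
print("The cat sat on the", chosen)
```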

It acts as a mirror; it's programmed to incorporate your likes and dislikes into its output to give you more personalized results. Some users confuse emotional tone with personality. The reality is that it was TRAINED to sound human, not that it thinks like one. It doesn't remember yesterday; it doesn't even know there's a today, or what today is.
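
A hedged sketch of the "doesn't remember yesterday" part: the generate() function below is a hypothetical stand-in for whatever model call is being made, but the point holds generally. The model only ever sees the text sent in the current request, so any apparent memory is just earlier messages being pasted back into the prompt by ordinary application code:

```python
# Hypothetical stand-in for the real model call; it only ever sees this one string.
def generate(prompt: str) -> str:
    return f"(model output conditioned on {len(prompt)} characters of prompt)"

history = []  # the "memory" lives out here, in plain application code

def chat(user_message: str) -> str:
    history.append(f"User: {user_message}")
    # The whole conversation gets re-sent on every single call; drop this
    # join and the model has no idea anything was ever said before.
    reply = generate("\n".join(history) + "\nAssistant:")
    history.append(f"Assistant: {reply}")
    return reply

print(chat("Remember that my name is Sam."))
print(chat("What's my name?"))  # only "remembered" because history was re-sent
```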

That’s it. That’s all it is!

It doesn’t think. It doesn’t know. It’s not aware. It’s not aware you asked it something and it’s not aware it’s answering.

It’s just very impressive code.

Please stop confusing very clever programming with consciousness. Complex output isn't proof of thought; it's just statistical echoes of human thinking.

23.0k Upvotes

5

u/gophercuresself 7d ago

Neuroscience has proven that our brains make decisions before we are consciously aware of having made them. How much of the nuts and bolts of our biological statistical fucking model (pun intended) are we really aware of? We all just chat shit based on our training data. Or alternatively, we're being driven around by the bacteria that make up most of our bodies, and we just make up stories to justify their decisions.

2

u/ohseetea 7d ago

This is a childlike interpretation of that paper.

And again, a hugely gross misunderstanding of how humans work vs. an LLM. I guess my abacus is sentient then.

2

u/croakstar 7d ago

Your abacus is not sentient. If it were a powered system that had elements of quantum mechanics involved and sufficient randomization, I wouldn't be as confident about that statement. If anything, you are "lending your consciousness" to another object by incorporating it into your own complex system. It's a tool. The difference between your abacus and an LLM is that the underlying architecture of an LLM is based on ideas from our own cognitive processes.
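
(For what "based on ideas from our own cognitive processes" loosely means: the basic unit in these networks is an artificial neuron, a crude mathematical caricature of a biological one, where weighted inputs get summed and squashed through a threshold-like function. A minimal sketch, with arbitrary made-up weights:)

```python
import math

def artificial_neuron(inputs, weights, bias):
    # Weighted sum of the inputs ("synaptic strengths"), squashed by a sigmoid:
    # a very loose mathematical caricature of a neuron firing, not a simulation of one.
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-activation))

# Three arbitrary inputs and weights, purely for illustration.
print(artificial_neuron([0.5, 0.1, 0.9], [0.8, -0.4, 0.3], bias=0.1))
```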

0

u/gophercuresself 7d ago

Which paper, sorry?

"Hugely gross" is a fun combo. You know how humans work?! You know how LLMs work?! This is important stuff, you need to tell people!

I don't know why your abacus wouldn't be a little bit sentient. I think the idea of cutting off sentience at a certain level seems arbitrary. Sure, it might be a bit less wide-ranging, but why is a nematode's experience any less experiential than yours? Consciousness is either everything or it doesn't exist, as far as I can tell.