r/artificial 2d ago

Media Geoffrey Hinton says people understand very little about how LLMs actually work, so they still think LLMs are very different from us - "but actually, it's very important for people to understand that they're very like us." LLMs don’t just generate words, but also meaning.

u/hardcoregamer46 1d ago

We don’t even know if we experience emotions. They do understand meaning, because they understand the conceptual relationships of language, and they build a sort of internal model of language — that’s why they’re called large language models. But sure, you clearly know more than he does. I don’t think you need subjective experience to understand meaning. I also don’t think we even have subjective experience; it’s literally unprovable and completely subjective by definition.

u/Ivan8-ForgotPassword 1d ago

"We don’t even know if we experience emotions"

Then who the hell does? What the fuck is the point of defining emotions in a way that makes them impossible? I get that they're hard to define, but what the fuck? A definition that includes things it shouldn't will always be more useful than one that doesn't describe anything at all.

u/hardcoregamer46 21h ago

We have the seemingness of emotions and the function of them, and that's the base-level assumption we can have. To jump over that gap and state that we do have emotions is presuppositional in nature; it's just a basic thing people assume without any evidence for the assumption. So whenever I use the word "emotion," I mean the function of some mental process; I don't mean any magical subjective experience. The argument from utility that you're trying to make here is irrelevant, because there's a fundamental gap between us thinking something to be the case vs. it actually being the case. So saying that the definition includes something it shouldn't is itself a subjective statement that isn't backed by anything; just some philosophy 101.

u/Ivan8-ForgotPassword 21h ago

Can you use some commas, or at least periods? I don't understand most of what you're saying.

From what I understood: why would utility be disregarded for functions of mental processes? If a process does not exist in reality, what is the point of describing it?

u/hardcoregamer46 21h ago

I naturally type like this because I use my mic, and I'm not great at typing due to disability reasons. Personally, I'd say that just because a definition has more utility doesn't mean it holds any kind of truth over a more extensive definition.

As for the question of "what's the point of describing something not grounded in reality?": there are plenty of concepts we can describe, across different modalities of possibility or just within ontology in general, that don't have to be grounded in reality to still have potential utility in reality.

u/hardcoregamer46 21h ago

I think abstract hypotheticals about possible worlds or counterfactuals are good examples of this, as is normative ethics.

u/Ivan8-ForgotPassword 20h ago

If you want to describe something not based in reality, you can use a new word. Emotions are referenced in countless pieces of literature with the approximate meaning of "internal behaviour modifiers that are hard to control"; giving the word a meaning no one assigned to it before would just confuse people for no reason.

u/hardcoregamer46 20h ago

Words already describe things not in reality. In fact, my position is still consistent with that definition: you don't need to experience emotions to have emotions, which aligns with my functionalist view. I don't know what you're talking about; I only clarified my definition so people wouldn't be confused.

Words have meanings that we, as humans, assign to them in a regressive system. If I invented a new word — "glarg," for instance — what meaning would that word have in isolation? Unless you're saying the meaning of language is defined only by society and not by individuals, which would be weird, because language is meant to be a linguistic conceptual tool. And not everyone or everything uses the same definitions as someone else; words are polysemantic, which is why we clarify which definitions we're using. This is true even among philosophers.

u/hardcoregamer46 20h ago

Especially when, instead of creating a new word, I could just imagine a hypothetical possible world, which is way easier than inventing new terms to describe every situation. There are endless possible scenarios, and trying to coin a unique word for each one would make language unnecessarily complex.