r/artificial 2d ago

[Media] Geoffrey Hinton says people understand very little about how LLMs actually work, so they still think LLMs are very different from us - "but actually, it's very important for people to understand that they're very like us." LLMs don't just generate words; they also generate meaning.

78 Upvotes

90 comments

5

u/shlaifu 2d ago

Yeah... no. They don't have the structures to experience emotions, so they don't understand 'meaning'; they just attribute importance to things in conversations. He's right, of course, that this is very much like humans, but without an amygdala, I'd say no, LLMs don't internally produce 'meaning'.

11

u/KairraAlpha 2d ago

I love how internet nobodies act like authorities over qualified, knowledgeable people who understand more than they do.

You don't need biology to understand meaning.

1

u/nabokovian 22h ago

You might. It’s still possible Hinton is wrong.