My claim is that an LLM behaves the same way (or a superset of the way) a person with only short-term memory would behave if the only way they could communicate was through text. Do you agree?
And I do not agree. LLMs are literally incapable of understanding the concepts of truth, right and wrong, or knowledge versus non-knowledge. Being able to tell whether you know something or not seems pretty crucial for anything approaching human-level intelligence.
Again, this conversation has been had in many variations ever since LLMs rose to prominence, and we can't keep rehashing the same points over and over. If one believes LLMs are capable of cognition, they should offer a formal proof first; otherwise we're just wasting our time.
That said, I wonder if there are major differences in cognition between humans, because there is no way I could look at how my own brain works and think, "oh, this LLM is capable of the same level of cognition as I am." Not because I am ineffably smart, but because LLMs are utterly simplistic compared to even a fruit fly.
>And I do not agree. LLMs are literally incapable of understanding the concepts of truth, right and wrong, or knowledge versus non-knowledge. Being able to tell whether you know something or not seems pretty crucial for anything approaching human-level intelligence.
How are you so sure about this?
> If one believes LLMs are capable of cognition,
Honestly asking: what formal proof is there of our own cognition?