Reminds me of this:

I think atproto is a good protocol, but god bluesky-the-company is dogshit.

  • swlabr@awful.systems

    “Consciousness requires semantic understanding” - I don’t see a way to operationalize this that GPT-4 fails. It misunderstands some words, and so does every human I know. You give GPT a definition, it can use it about as well as a school child.

    I would interrogate this, plus “intelligence”, a little more. LLMs don’t “understand” in the way that we do, and personally I don’t think that they really understand at all. A dictionary containing example sentences would also basically pass this test. LLMs also don’t really have “intelligence”.

    Anyway, we’re very far from figuring out consciousness. Claims about LLMs being conscious are meaningless marketing.