• doodledup@lemmy.world
    5 months ago

    You’re doing reasoning based on chemical reactions. Who says it can’t do reasoning based on text? Who says it’s not doing that already in some capacity? Can you prove that?

    • MentalEdge@sopuli.xyz
      5 months ago

      Is language conscious? Is it possible to “encode” human thinking into the media we produce?

      Humans certainly “decode” ideas, knowledge, trains of logic and more from media, but does that mean the media contains the components of consciousness?

      Is it possible to produce a machine that “decodes” not the content of media, but the process through which it was produced? Does media contain the latter in the first place?

      How can you tell the difference if it does?

      The more I learn about how modern machine learning actually works, the more certain I become that even if having a machine “decode” human media is the path to AGI, LLMs ain’t it.

      It just doesn’t work in a way that would allow for a mind to arise.

      • NounsAndWords@lemmy.world
        5 months ago

        Is language conscious?

        Are atoms?

        I don’t know if LLMs of a large enough size can achieve (or sufficiently emulate) consciousness, but I do know that we barely know anything about consciousness, let alone its limits.

        • mke@lemmy.world
          5 months ago

          Saying “we don’t know, and it’s complicated, therefore there’s a chance, maybe, depending” is barely the beginning of an argument.

    • mke@lemmy.world
      5 months ago

      If you genuinely think LLMs are in any way capable of even basic reasoning despite all arguments to the contrary, I honestly don’t want to keep trying to convince you. You’re asking for a miracle out of me (to explain consciousness itself, even) while you get to just say “but there’s a chance,” even though LLMs can’t get basic facts right.