• Excrubulent@slrpnk.net · 19 hours ago

    And any time I see anyone advocating this crap, it's always because it gets the job done "faster". But the rule is "fast, cheap, good: pick two", and LLMs don't break it.

    Yeah, they get it done super fast, and super shitty. I've yet to see anyone explain how an LLM gets the job done better, not even the most rabid apologists.

    LLMs have zero fidelity, and information without fidelity is just noise. They are not good at information work. In fact, I don't see how you get information with fidelity without a person in the loop; on a fundamental, philosophical level, I don't think it's possible. Fidelity requires truth, which requires meaning, and I don't think you get a machine that understands meaning without AGI.