• blazeknave@lemmy.world
    7 months ago

    Ty. As soon as I saw the headline, I knew I wouldn’t be finding value in the article.

    • ALostInquirer@lemm.ee
      7 months ago

      It’s not a bad article, honestly; I’m just tired of journalists and academics echoing the language of businesses and their marketing. “Hallucinations” isn’t an accurate term for this form of AI. These are sophisticated generative text tools, and in my opinion they lack any qualities that justify all this fluffy terminology personifying them.

      Also, frankly, I think students have found one of the better applications for large-language-model AIs than many adults, even those trying to deploy them professionally. Students use them to do their homework and generate their papers, which is exactly what these tools are built for. Too many adults act as if these tools, in their present form, should be used as research aids, but their entirely generative basis undermines their reliability for that. It’s using the wrong tool for the job.

      You don’t want any of the generative capacities of a large-language-model AI for research help; instead, you’d want whatever text processing it can do to assemble and provide accurate output.