• KillingTimeItself@lemmy.dbzer0.com · 32 points · 7 months ago

    it’s only going to get worse, especially as datasets deteriorate.

    With sites like Reddit being overrun by AI-generated content while also selling that content as AI training data, I can only imagine what a mess that's going to cause.

    • Cyberflunk@lemmy.world · 5 points · 7 months ago

      Hallucinations, like depression, are a multifaceted issue. Training data is only one piece of it. Quantized or overfitted models lean on rote memorization at the expense of otherwise correct training data, and poorly structured inference prompts can also confuse a model.

      Rest assured, this isn’t just training data.
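      The quantization point can be made concrete: storing weights in fewer bits rounds them, so the values the model actually computes with differ slightly from the trained ones. A minimal NumPy sketch of symmetric int8 quantization (all numbers here are invented for illustration, not taken from any real model):

      ```python
      import numpy as np

      # Hypothetical weight vector, randomly generated just for this sketch.
      rng = np.random.default_rng(0)
      weights = rng.normal(0.0, 0.02, size=8).astype(np.float32)

      # Symmetric int8 quantization: map the largest-magnitude weight to +/-127.
      scale = np.abs(weights).max() / 127
      q = np.round(weights / scale).astype(np.int8)   # what gets stored
      dequant = q.astype(np.float32) * scale          # what the model computes with

      # Every weight is now off by up to scale/2 — small, but it accumulates
      # across millions of parameters and many layers.
      error = np.abs(weights - dequant)
      print("max rounding error:", error.max())
      ```

      Each individual error is bounded by half the quantization step, which is why aggressive quantization can nudge a model toward outputs its full-precision weights would not have produced.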

    • vegetal@lemmy.world · 4 points · 7 months ago

      I think you are spot on. I tend to think the problems may soon begin to outnumber the potential benefits.

      • KillingTimeItself@lemmy.dbzer0.com · 5 points · 7 months ago

        and we haven’t even gotten into the problem of what happens when there’s no more data to feed it. Do you make more? That’s an impossible task.