• 0 Posts
  • 47 Comments
Joined 1 year ago
Cake day: July 7th, 2023





  • maniclucky@lemmy.world to Memes@lemmy.ml · AI bros · 6 points · 1 month ago

    Absolutely. It’s why asking it for facts is inherently bad. It can’t retain information; it’s trained to produce output shaped like an answer. It’s pretty good at things that don’t have one specific right answer (I’ll never write another cover letter, thank blob).

    Now, if someone had the good sense to add some kind of lookup that injects correct information between the prompt and the output, we’d be cooking with gas (rough sketch below). But that’s really labor-intensive for humans, and all the tech bros are trying to avoid it.
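
    A minimal, self-contained sketch of that lookup step (roughly what gets called retrieval-augmented generation). The knowledge base, the retrieval, and the final model call are toy stand-ins assumed for illustration, not any real library’s API:

    ```python
    # Toy "lookup between the prompt and the output": retrieve curated facts,
    # then splice them into the prompt so the model answers from them
    # instead of just producing answer-shaped text.
    KNOWLEDGE_BASE = {
        "boiling point of water": "Water boils at 100 °C at sea-level pressure.",
        "speed of light": "Light travels at about 299,792 km/s in a vacuum.",
    }

    def lookup_facts(question: str) -> list[str]:
        """Return curated facts whose keys appear in the question (toy retrieval)."""
        q = question.lower()
        return [fact for key, fact in KNOWLEDGE_BASE.items() if key in q]

    def build_prompt(question: str) -> str:
        """Inject the retrieved facts between the instructions and the question."""
        facts = lookup_facts(question)
        context = "\n".join(f"- {f}" for f in facts) or "- (no verified facts found)"
        return (
            "Answer using ONLY the facts below; if they don't cover it, say so.\n"
            f"Facts:\n{context}\n"
            f"Question: {question}"
        )

    if __name__ == "__main__":
        # A real system would send this prompt to the model; printing it
        # shows what the model would see after the lookup.
        print(build_prompt("What is the boiling point of water?"))
    ```

    The human-labor part is curating KNOWLEDGE_BASE and keeping it correct; the code around it is the easy bit.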


  • maniclucky@lemmy.world to Memes@lemmy.ml · AI bros · 8 up / 3 down · 1 month ago

    Gradient descent is a common algorithm in machine learning (AI* is a subset of machine learning algorithms). It refers to using math to measure how wrong the current answer is, and in which direction, then adjusting the model’s parameters to make it less wrong using that information (minimal example below).
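
    A bare-bones example of the idea, fitting y = w·x to data in plain Python (the numbers and the learning rate are made up for illustration):

    ```python
    # Gradient descent in one dimension: repeatedly measure how wrong the
    # current weight is (the loss), in which direction (the gradient),
    # and nudge the weight to be less wrong.
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.0, 4.0, 6.0, 8.0]   # underlying relationship: y = 2x

    w = 0.0                     # initial guess
    learning_rate = 0.01

    for step in range(200):
        # Gradient of the mean squared error with respect to w:
        # tells us which way, and how strongly, w is currently wrong.
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        # Step against the gradient: adjust w to be less wrong.
        w -= learning_rate * grad

    print(f"learned w = {w:.3f}")   # converges toward 2.0
    ```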







  • I’ve seen this comment before. My counter: can you assure me that, for example, a new homeowner who doesn’t know better won’t disturb the scale? That they won’t have a leaky faucet and mess with the pipes? Or that something like Flint will never happen again, where necessary infrastructure changes force you to disturb the scale?

    This ‘solution’ only ‘works’ if you leave the plumbing completely alone and never touch it. So don’t get new appliances, never have a plumber fix anything, never replace that water main that’s going to break down any time now. It’s a very short-sighted ‘solution’ to the problem. I’d hazard it’s a good argument for triage, with cities that need new infrastructure anyway going first. But fobbing it off as ‘it’s fine’ isn’t OK.