AI girlfriend bots are already flooding OpenAI’s GPT store::OpenAI’s store rules are already being broken, illustrating that GPTs could be hard to regulate

  • HelloHotel@lemm.ee · edit-2 · 6 months ago

    It’s a hard question to answer; there is a good reason, but it’s several paragraphs long, and I likely have gaps in my knowledge and am in some places misguided. The reduced idea: it’s being emotionally open (no emotional guarding or sandboxing/RPing) with a creature that lacks many of the traits required to take on that responsibility. The model is pretrained to perform gestures that make us happy, with no internal state to ask itself whether it would enjoy garlic bread given its experience with garlic. It’s an advanced tape recorder, pre-populated with an answer. Or it lies and picks something, because saying “I don’t know” is the wrong response. As opposed to a creature that has some kind of consistent external world and a memory system. Firehosing it with data means less room for artistic intent.

    If you’re sandboxing/roleplaying, there’s no problem.