The BBC has issued a statement that offers important context to Sara Poyzer’s viral social media posts. The British broadcaster said it is using AI technology in a “highly sensitive documentary” to represent the voice of a person who is nearing the end of their life.

Poyzer was penciled in for the job, but her services are no longer required as the BBC attempts to honor the wishes of the contributor’s family by dedicating a brief — and clearly signposted — section of the documentary to recreating “a voice which can now no longer be heard.”

  • Rentlar@lemmy.ca · 9 months ago

    In this specific case I'm okay with, even happy about, AI being used this way, since the person being impersonated themselves wished to be voiced by the AI. I can totally see why: it's like technological magic that "gives your voice back," letting you sound like yourself from any point in your recorded lifetime you like!

    For deceased people, though, I don't think AI should be used to put words in their mouths for commercial purposes without their permission. When AI gets good enough, why hire new actors for a movie when you can just reanimate Michael Jackson forever? Hee!

    • thehatfox@lemmy.world · 9 months ago

      In the AI age we are going to need some form of lasting likeness rights, in life and in death.

      There should be some sort of protection against having a person's appearance hijacked, especially for commercial purposes.

      • Sneezycat@sopuli.xyz · 9 months ago

        I don’t think Che Guevara gave permission to use his image for trendy shirts. Is it that different? They’re using the image of a dead person to sell a product.

        We need better protection against companies in general.