Robin Williams’ Daughter Speaks Out Against AI Recreations Of Actors’ Voices: “I Find It Personally Disturbing”

Robin Williams’ daughter speaks out against AI recreations of actors’ voices, explaining her own “disturbing” experiences with the controversial tech.

  • foggy@lemmy.world · 9 months ago

    It is troublesome but I feel like the problem is intractable.

    • query@lemmy.world · 9 months ago

      For personal use, but corporations trying to profit off of it could be fined 100% of their assets if need be.

      • foggy@lemmy.world · 9 months ago

        I’m down with that.

        But it won’t stop it from spreading like wildfire.

        I envision websites that openly let you do stuff like this, with fine-print disclaimers telling you not to try to profit off that content.

      • bstix@feddit.dk · 9 months ago

        Proving it’s happening is going to be difficult… more difficult than telling Vanilla Ice from Queen/Bowie.

        There is currently no way to copyright a voice. There is also no good definition of what a voice even is legally.

        It has always been happening to some degree. Casting directors would cast someone who sounds like that other person, etc. It happens to music all the time: TV shows use cover music that is so close to the song it’s supposed to remind the viewer of, but technically just isn’t it. Voices are going to be much more difficult, especially with AI spitting out audio of what it would have sounded like if a certain person had said it, but didn’t. It’ll be impossible.

        This will be the death of voice acting.

          • phx@lemmy.ca · 9 months ago

            Yeah, and as for the “there’s no way to tell” argument:

            Well, when somebody is selling shit like “Robin Williams Alexa Nest voice $19.99 GPS narration” or bullshit like that, it’ll be pretty obvious.

            A voice that sounds kinda “like Robin Williams” without being outright stated as such - or using his voice and others as a seed for AI generated output - might be a lot more difficult to pin down.

            US corporations are all salivating at how they can use the technology to cut out workers and make record profits, but it’ll be a quick turnaround once sellers in every other country who don’t give a fuck about copyright start undercutting them on the market. Once that happens, they’ll be clamoring for “better regulation,” and it’ll be too late.

        • tony@lemmy.hoyle.me.uk · 9 months ago

          Well, as to that last point, yes, definitely… it’ll become worthless as a skill. Once you can make an AI that speaks any line perfectly, you don’t need anyone special to do it… Joe Schmoe off the street would be just as good a ‘voice actor’ as the current professionals. There’s obviously skill in making the AI do that seamlessly, but that’s a different job.

          Translation went through something similar. It used to be something you paid someone a lot of money for; now you just type the sentence into your phone.

        • three@lemm.ee · 9 months ago

          if i write a bunch of words i can’t be wrong

      • Andy@slrpnk.net · 9 months ago

        I like this. I don’t mind the use itself for limited purposes; I just don’t think there should be any financial benefit to it.

  • j4k3@lemmy.world · 9 months ago

    It sucks, but it’s the price of selling oneself to the public spotlight.

    Much like internet culture has taken decades to develop some basic sense of civility, it will take a long time for these tools to settle into some kind of shared morality. I figure the best course is always to treat others the way you want to be treated. Most people are not trying to hurt anyone; they just haven’t thought through the consequences of what they are doing.

    What we really need in the short term is a path of least resistance to make AI characters and tools that provide an easier way to create projects, demonstrations, and art. It is the availability of media containing public figures that is primarily driving this kind of thing. If it was easier to find and use a digital alternative entity, I think most people would use it as an alternative.

    One thing I have tried is to use an amalgamated creation of people that are accessible in the dataset (PersonA:PersonB). If done well, the final results can be consistent and unrecognizable as either person; a sketch of the idea follows below.
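    A minimal sketch of what that kind of amalgamation could look like on the voice side, assuming a hypothetical encoder that turns each speaker’s audio into a fixed-size embedding. The names, dimensions, and blend_speakers helper below are illustrative, not any particular library’s API; the (PersonA:PersonB) syntax mentioned above plays a similar role on the prompt side.

    ```python
    import numpy as np

    # Hypothetical speaker embeddings, e.g. 256-dim vectors that a voice-cloning
    # encoder might produce from a few seconds of each person's audio. Real
    # systems would compute these; random vectors stand in for them here.
    rng = np.random.default_rng(seed=42)
    person_a = rng.normal(size=256)
    person_b = rng.normal(size=256)

    def blend_speakers(a: np.ndarray, b: np.ndarray, weight: float = 0.5) -> np.ndarray:
        """Linearly interpolate two speaker embeddings into one amalgamated voice.

        weight=0.0 keeps speaker A, weight=1.0 keeps speaker B; values in between
        give a blend that a downstream TTS decoder would render as neither person.
        """
        mixed = (1.0 - weight) * a + weight * b
        # Rescale to the average norm of the inputs so the blend stays on the
        # same scale the (hypothetical) decoder expects.
        target_norm = (np.linalg.norm(a) + np.linalg.norm(b)) / 2.0
        return mixed * (target_norm / np.linalg.norm(mixed))

    amalgam = blend_speakers(person_a, person_b, weight=0.5)
    print(amalgam.shape)  # (256,) -- feed this to the decoder instead of a real voice
    ```

    The point of the sketch is just that the blend, rather than either real person, is what ends up driving the generated output.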

        • SnowdenHeroOfOurTime@unilem.org · 9 months ago

          I didn’t read this novel but what I read in your top level comment is borderline unhinged.

          No, just because people can emulate people doesn’t make it right. And no, this way of thinking doesn’t lead to slavery. That’s an outright stupid thing to say.

  • RobotToaster@mander.xyz · 9 months ago

    The response we are seeing from actors now is very similar to what happened when video recording was introduced.

    One of the reasons so many old TV shows like Doctor Who are lost is the rules actors pushed for, requiring their fees to be paid a second time after a performance had been shown a certain number of times. The broadcasters ended up just destroying the tapes after broadcasting them.

    Whatever happens I just hope nothing is lost this time.