• cm0002@lemmy.world · 7 months ago

It’s worth something more often than you’d think; it’s just usually hawked by random “influencers” with a paltry 50k followers or some shit. That kind of exposure is worthless.

I’ve heard from photographers and artists before that they will consider exposure offers, but only if you have actual status: a major brand, a near-celebrity, or a top-50 social media “influencer” with millions upon millions of followers.

And they usually want a contract, i.e. “you have to pin our work for X days, you have to tag us,” etc.

    • Jesus@lemmy.world · 7 months ago

If you look at the announcement, they’re pretty damn boxed in. They can’t scrape the local device, or iCloud. OpenAI only gets queries that the dumber Apple model thinks would be better served by OpenAI. And each of those queries is gated by a dialog that says “Do you want me to use ChatGPT to do that? Cancel / Use ChatGPT”

That said, on stage Apple briefly mentioned that ChatGPT Plus users would have more functionality. I’ll bet money that’s the real play: LLM model subscriptions in the App Store. Apple loves that sweet, sweet App Store and subscription money.

Question is, do they take a cut like with Spotify, or is basic free GPT-4 access payment enough?

    • XiozTzu@lemmy.world · 7 months ago

      What data? The data that the user affirmatively agrees to send them that is anonymized? That data?

      • NotAnotherLemmyUser@lemmy.world · 7 months ago

        I’m sure you understand this, but “anonymized” doesn’t mean the data can’t be deanonymized. Given the right kind of data, or enough context, someone can figure out who you are fairly quickly.

        For example: you could “anonymize” GPS traces, but they would still show the house you live at and where you work unless you strip out a lot of the info.

        http://androidpolice.com/strava-heatmaps-location-identity-doxxing-problem/

        Now with LLMs, sure, you could “anonymize” which user said or asked for what… but if something identifying is sent in the request itself, it won’t be hard to deanonymize that data.
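To make the GPS example concrete, here’s a minimal Python sketch (with made-up coordinates, no real data) of why a trace with the user ID stripped still gives away a home address: bucket the points into ~100 m cells and the cell most visited overnight is almost certainly where the person sleeps.

```python
from collections import Counter

def infer_home(points):
    """Guess a likely home location from an 'anonymized' GPS trace.

    points: list of (lat, lon, hour) tuples with no user ID attached.
    Rounds coordinates to ~100 m cells and returns the cell most
    visited overnight (hours 0-6). No identity is needed; the trace
    itself gives the home away.
    """
    overnight = [
        (round(lat, 3), round(lon, 3))
        for lat, lon, hour in points
        if 0 <= hour < 6
    ]
    if not overnight:
        return None
    return Counter(overnight).most_common(1)[0][0]

# A stripped-down trace: no name, no device ID, just points.
trace = [
    (40.7128, -74.0060, 2),   # overnight pings cluster at one spot...
    (40.7128, -74.0060, 3),
    (40.7129, -74.0061, 4),
    (40.7484, -73.9857, 14),  # ...daytime pings at another (work)
]
print(infer_home(trace))  # the overnight cluster: (40.713, -74.006)
```

Real deanonymization work (like the Strava heatmap research linked above) uses fancier clustering, but the core leak is exactly this simple.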

        • Zos_Kia@lemmynsfw.com · 7 months ago

          I don’t know about the US, but in European GDPR parlance, if it can be reversed then it is NOT anonymized, and it is illegal to claim otherwise. The correct term is pseudonymized.

        • XiozTzu@lemmy.world · 7 months ago

          So you would rather submit your non-anonymized data? Because those bastards will find a way to de-anonymize it. Is Apple doing the right thing or not?

          • NotAnotherLemmyUser@lemmy.world · 7 months ago

            What? No. I would rather use my own local LLM where the data never leaves my device. And if I had to submit anything to ChatGPT I would want it anonymized as much as possible.

            Is Apple doing the right thing? Hard to say, any answer here will just be an opinion. There are pros and cons to this decision and that’s up to the end user to decide if the benefits of using ChatGPT are worth the cost of their data. I can see some useful use cases for this tech, and I don’t blame Apple for wanting to strike while the iron is hot.

            There’s not much you can really do to strip identifying data out of prompts/requests made to ChatGPT; any anonymization of that part of the data is on OpenAI to handle.
            Apple can obfuscate which user is asking for what, as well as specific location data, but if I’m using the LLM and I include my full name in my prompt/request… that all goes directly into OpenAI’s servers and logs, which they can eventually use to help refine/retrain their model at some point.
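As a rough illustration of why scrubbing prompt content is so hard, here’s a toy Python scrubber of the sort an intermediary could run on outgoing requests (the patterns and placeholder tokens are my own invention, not anything Apple or OpenAI has described): it catches rigidly formatted identifiers, but identifying free-text context passes straight through.

```python
import re

# Regexes for identifiers with a fixed shape. Free-text identifiers
# ("Jane Doe, owner of Doe's Bakery") have no such shape to match.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
    (re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"), "[PHONE]"),
]

def scrub(prompt: str) -> str:
    """Replace pattern-shaped identifiers with placeholder tokens."""
    for pattern, token in PATTERNS:
        prompt = pattern.sub(token, prompt)
    return prompt

print(scrub("Email me at jane.doe@example.com about the report."))
# -> "Email me at [EMAIL] about the report."
print(scrub("Write a bio for Jane Doe, owner of Doe's Bakery in Springfield."))
# -> unchanged: names and context sail straight through
```

That second prompt identifies the user just as surely as the first, which is why the anonymization burden ultimately lands on whoever holds the raw request logs.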

              • farcaller@fstab.sh · 7 months ago

                IIRC they demonstrated an interaction with Siri where it asks the user for consent before enriching the data through ChatGPT. So yeah, that seems to mean your data is sent out (if you consent).

              • NotAnotherLemmyUser@lemmy.world · 7 months ago

                I’d say the burden of proof is on Apple to show that it’s being done on-device or that all processing is done on iCloud servers.

                You’re saying that OpenAI is just going to hand over their full ChatGPT model for Apple to set up on their own servers for free?

                But from the article itself:

                the partnership could burn extra money for OpenAI, because it pays Microsoft to host ChatGPT’s capabilities on its Azure cloud

                I get it if they created a small version of their LLM to run locally, but I would expect Apple to pay a price even for that.

                I think you may be confusing this ChatGPT integration with Apple’s own LLM that they’re working on… Again, from the linked article:

                Still, Apple’s choice of ChatGPT as Apple’s first external AI integration has led to widespread misunderstanding, especially since Apple buried the lede about its own in-house LLM technology that powers its new “Apple Intelligence” platform.

      • doodledup@lemmy.world · 7 months ago

        The point is that they can use that data for further training. They want to build a monopoly like Google is for search.

        • tal@lemmy.today · 7 months ago

          They want to build a monopoly like Google is for search.

          There’s Bing, and some others. I’m using Kagi. You can pretty much drop one in for another.

          Google has a significant amount of marketshare, but it doesn’t really have the ability to determine the terms on which a consumer can get access to search services, which is what lets a monopoly be a monopoly.

          They’ve got a monopoly over providing some services to Android users, maybe.

        • XiozTzu@lemmy.world · 7 months ago

          Like Google did with user queries and crawling data. I’m just saying everyone is happily giving these companies data. You are welcome to not use the GPT functionality just like you are welcome to use DuckDuckGo. I’m not getting the hostility to Apple. Microsoft on the other hand…

  • Jesus@lemmy.world · 7 months ago

    I’ll take bets here.

    I don’t think user data is where OpenAI makes its money. It’s the $20-a-month GPT Plus subscriptions.

    Apple announced that Plus users would get more functionality. Also, OpenAI is basically limited to collecting data from queries that the user explicitly agrees to send to a 3rd-party model. Each GPT-4 query prompts the user with “Do you want me to use ChatGPT to do that?”

    Apple’s not really in the business of selling data, but they are famously and infamously in the business of selling subscriptions to shit.

    • Petter1@lemm.ee · 7 months ago

      And this is the way it has to be! Fuck all this “free” stuff paid by data and ads!

      • Jesus@lemmy.world · 7 months ago

        Google: tries to track your behavior so they can sell targeted ads in their freeware

        Apple: aggressively tries to lock you into platforms and ecosystems so they can sell you subscriptions, apps, and hardware

        Both shitty, but one company’s business model is much more dependent on user data to exist.

    • narc0tic_bird@lemm.ee · 7 months ago

      Plus users do get more functionality, but it sounded like these are just the extra features a Plus subscription gives you anyway (so analysis, image generation etc.).

  • Kokesh@lemmy.world · 7 months ago

    How will this be financially viable for OpenAI? It costs lots of money to run this crap.

    • Jesus@lemmy.world · 7 months ago

      GPT Plus subscriptions.

      During the WWDC keynote they mentioned that GPT Plus users would get additional functionality in iOS.

    • Jesus@lemmy.world · 7 months ago

      Every query is gated by a consent alert, and OpenAI isn’t allowed to rummage around through iOS, macOS, or iCloud. If the more private, but dumber, Apple model can’t do the job, you get this alert.
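The flow described above can be sketched as a simple routing rule. All function names here are hypothetical stand-ins, since Apple hasn’t published the actual interface; it’s just the logic as demoed on stage.

```python
def route_query(query, local_model, ask_consent, send_to_chatgpt):
    """Consent-gated routing sketch: try the on-device model first;
    fall back to ChatGPT only if the user explicitly approves."""
    answer = local_model(query)
    if answer is not None:  # local model could handle it
        return answer
    if not ask_consent("Do you want me to use ChatGPT to do that?"):
        return None         # user tapped Cancel: nothing leaves the device
    return send_to_chatgpt(query)

# Toy stand-ins for the real components.
local = lambda q: "42" if "simple" in q else None
consent_yes = lambda msg: True
remote = lambda q: "chatgpt:" + q

print(route_query("simple math", local, consent_yes, remote))
# -> "42" (handled locally, nothing sent out)
print(route_query("write a novel", local, consent_yes, remote))
# -> "chatgpt:write a novel" (sent only after consent)
```

The point of the structure is that OpenAI never sees the queries the local model handles, and never sees a declined one at all.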

      My guess is that, since the data collected comes from GPT-4 queries that require expensive compute, the business move is actually around subscriptions. The data probably isn’t worth the query compute expense.

      Apple announced that GPT Plus users would have more functionality, and if there is one thing Apple is exceedingly good at, it’s selling subscriptions to shit.

    • webghost0101@sopuli.xyz · 7 months ago

      And yet, if you look at the technical details, between Google, Apple, and Microsoft, this does seem to be the lesser evil for privacy.

      Of course, if you have something like a de-Googled Fairphone I’ll bow down to you, but if you also have a Windows PC you’re screwed either way.

  • Vincente@lemmy.world · 7 months ago

    No, the payment is Apple users’ data; it’s me. If I can’t turn it off completely, I won’t update the OS.

    • 8ender@lemmy.world · 7 months ago

      Chill and do some reading. It’s not only opt-in but can be disabled at any time, and it’s opt-in per request. It’ll tell you before anything goes to ChatGPT and even then it’s anonymized.

      • Vincente@lemmy.world · 7 months ago

        It is not absolutely safe and anonymous. If I can’t just turn it off completely, it’s unacceptable for me.