The new global study, conducted in partnership with The Upwork Research Institute, surveyed 2,500 C-suite executives, full-time employees, and freelancers worldwide. The results show that optimistic expectations about AI’s impact are not aligning with the reality many employees face: there is a disconnect between managers’ high expectations and the actual experiences of employees using AI.

Despite 96% of C-suite executives expecting AI to boost productivity, the study reveals that 77% of employees using AI say it has added to their workload and created challenges in achieving the expected productivity gains. Not only is AI increasing the workloads of full-time employees, it’s also hampering productivity and contributing to employee burnout.

  • Flying Squid@lemmy.world · 5 months ago (edited)

    What New Yorker article? You didn’t link to one. I, however, linked to Yale University, which has a slightly better track record on science than The New Yorker.

    And, again, you are arguing that emitting less CO2 is a good thing. It is not.

    And if water can be a big issue, why is AI a good thing when it uses it up? You can say “people shouldn’t build data centers in those locations,” but they are. And the world doesn’t run on “shouldn’t.”

    Edit: Now you linked to it. It’s paywalled, which means I can’t read it and I doubt you did either.

    • Womble@lemmy.world · 5 months ago

      Whole article for ref, since you can’t access it for whatever reason (it’s not very nice assuming bad faith like that, btw):

      In 2016, Alex de Vries read somewhere that a single bitcoin transaction consumes as much energy as the average American household uses in a day. At the time, de Vries, who is Dutch, was working at a consulting firm. In his spare time, he wrote a blog, called Digiconomist, about the risks of investing in cryptocurrency. He found the energy-use figure disturbing.

      “I was, like, O.K., that’s a massive amount, and why is no one talking about it?” he told me recently over Zoom. “I tried to look up some data, but I couldn’t really find anything.” De Vries, then twenty-seven, decided that he would have to come up with the information himself. He put together what he called the Bitcoin Energy Consumption Index, and posted it on Digiconomist. According to the index’s latest figures, bitcoin mining now consumes a hundred and forty-five billion kilowatt-hours of electricity per year, which is more than is used by the entire nation of the Netherlands, and producing that electricity results in eighty-one million tons of CO2, which is more than the annual emissions of a nation like Morocco. De Vries subsequently began to track the electronic waste produced by bitcoin mining—an iPhone’s worth for every transaction—and its water use—which is something like two trillion litres per year. (The water goes toward cooling the servers used in mining, and the e-waste is produced by servers that have become out of date.)

      Last year, de Vries became concerned about another energy hog: A.I. “I saw that it has a similar capability, and also the potential to have a similar growth trajectory in the coming years, and I felt immediately prompted to make sure people are aware that this is also energy-intensive technology,” he explained. He added a new tab to his blog: “AI sustainability.” In a paper he published last fall, in Joule, a journal devoted to sustainable energy, de Vries, who now works for the Netherlands’ central bank, estimated that if Google were to integrate generative A.I. into every search, its electricity use would rise to something like twenty-nine billion kilowatt-hours per year. This is more than is consumed by many countries, including Kenya, Guatemala, and Croatia.

      “There’s a fundamental mismatch between this technology and environmental sustainability,” de Vries said. Recently, the world’s most prominent A.I. cheerleader, Sam Altman, the C.E.O. of OpenAI, voiced similar concerns, albeit with a different spin. “I think we still don’t appreciate the energy needs of this technology,” Altman said at a public appearance in Davos. He didn’t see how these needs could be met, he went on, “without a breakthrough.” He added, “We need fusion or we need, like, radically cheaper solar plus storage, or something, at massive scale—like, a scale that no one is really planning for.”

      Last week, the International Energy Agency announced that energy-related global CO2 emissions rose, yet again, in 2023, to more than thirty-seven billion metric tons. The increase comes at a time when the whole world is supposedly striving to reach net-zero emissions, and it indicates that global efforts are, to put it mildly, falling short. Much of the increase in emissions came from China, and most of it was driven by century-old technologies, such as the internal-combustion engine. So data centers are, for now at least, a small part of the problem. Still, as the use of A.I. ramps up and bitcoin prices reach new heights, the question is: How can the world reach net zero if it keeps inventing new ways to consume energy? (In the U.S., data centers now account for about four per cent of electricity consumption, and that figure is expected to climb to six per cent by 2026.)

      Mining cryptocurrencies like bitcoin eats up electricity owing to the way the system was set up. To acquire bitcoin (and other currencies that rely on a similar scheme), miners compete to answer cryptographic riddles. Winning the competition takes a lot of computing power. As a result, server farms devoted to crypto mining tend to be situated in parts of the world where electricity is cheap. China used to lead the world in crypto mining, but it imposed a ban on the practice in 2021, and now the U.S. is No. 1. A few months ago, the U.S. Department of Energy tried to compel mining concerns to report their energy use, but in February a Texas judge issued a temporary restraining order blocking the effort. (According to the White House Office of Science and Technology Policy, crypto mining in the U.S. uses almost as much energy as all the nation’s home computers combined.) Meanwhile, the higher the price of bitcoin rises—it reached a record of sixty-nine thousand dollars on March 5th—the bigger the financial incentives for mining it, and the more energy consumed.

      Artificial intelligence requires a lot of power for much the same reason. The kind of machine learning that produced ChatGPT relies on models that process fantastic amounts of information, and every bit of processing takes energy. When ChatGPT spits out information (or writes someone’s high-school essay), that, too, requires a lot of processing. It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

      A.I. could potentially be used to alleviate some of the problems it is exacerbating. For instance, it might be used to improve the efficiency of renewable-energy systems, which could reduce emissions from server farms. But it seems unlikely that such gains will keep up with A.I.’s growing electricity demands; this, presumably, is why Altman argues that a technological breakthrough is needed.

      De Vries, for his part, is dismayed by what he sees as a lack of human learning in the face of so much machine learning. “I think the only thing that’s realistic in terms of policy, at least in the short to medium term, is disclosure requirements,” he said. “It’s taken a very long time before we got there with regard to cryptocurrencies, and I’m disappointed that we haven’t gotten there sooner with A.I. It’s like we saw what cryptocurrency mining could do, and we totally forgot about it.” ♦

      • Flying Squid@lemmy.world · 5 months ago (edited)

        Your link is just about Google’s energy use, still says it uses a vast amount of energy, and says that A.I. is partially responsible for climate change.

        It even quotes that moron Altman saying that there’s not enough energy to meet their needs and something new needs to be developed.

        I have no idea why you think this supports your point at all.

        • Womble@lemmy.world · 5 months ago

          Artificial intelligence requires a lot of power for much the same reason. The kind of machine learning that produced ChatGPT relies on models that process fantastic amounts of information, and every bit of processing takes energy. When ChatGPT spits out information (or writes someone’s high-school essay), that, too, requires a lot of processing. It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

          That was the only bit I was referring to, as a source for the roughly 0.5 GWh of energy usage per day for GPT. I agree that what Altman says is worthless, or worse, deliberately manipulative to keep the VC money flowing into OpenAI.
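          A quick back-of-the-envelope check of that figure, using only the numbers quoted from the article (both the 500,000 kWh/day and the 29 kWh/household figures are the article’s estimates, not measurements):

```python
# Sanity check of the quoted figures: ~500,000 kWh/day for ChatGPT
# vs. ~29 kWh/day for the average U.S. household, at ~200 million
# requests per day. All three numbers come from the article.
chatgpt_kwh_per_day = 500_000        # ≈ 0.5 GWh per day
household_kwh_per_day = 29
requests_per_day = 200_000_000

# Equivalent number of average U.S. households
households_equivalent = chatgpt_kwh_per_day / household_kwh_per_day

# Implied energy per request, in watt-hours
wh_per_request = chatgpt_kwh_per_day * 1000 / requests_per_day

print(f"≈ {households_equivalent:,.0f} U.S. households")  # ≈ 17,241
print(f"≈ {wh_per_request:.1f} Wh per request")           # ≈ 2.5 Wh
```

          So by the article’s own numbers, that’s roughly the daily electricity use of 17,000 U.S. households, or about 2.5 Wh per request.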

          • Flying Squid@lemmy.world · 5 months ago

            I see, so if we ignore the rest of the article entirely, your point is supported. What an odd way of trying to prove a point.

            Also, I guess this was a lie:

            Ok, you just want to shout, not discuss, so I won’t engage any further.

            Although since it was a lie, I’d love you to tell me what you think I was shouting about.

    • Womble@lemmy.world · 5 months ago (edited)

      Apologies, I didn’t post the link, it’s edited now.

      If you want to take issue with all energy usage, that’s fine; it’s a position to take. But it’s quite a fringe one, given that harnessing energy is what gives us the quality of life we have. Thankfully, electricity is one of the easiest forms of energy to decarbonise, and that is already happening rapidly with solar and wind power; we need to transition more of our energy usage to it in order to reduce fossil fuel usage. My main point is that this railing against AI energy usage is akin to the plastic straw ban: mostly performative, and a distraction from the places where truly vast amounts of fossil fuels are burnt that need to be tackled urgently.

      You can say “people shouldn’t build data centres in those locations,” but they are. And the world doesn’t run on “shouldn’t.”

      I’m 100% behind forcing data centres to use sustainable water sources or other methods of cooling. But that is a far cry from AI energy consumption being a major threat; the vast majority of data centre usage isn’t AI anyway, it’s serving websites like the one we are talking on right now.

      • rekorse@lemmy.world · 5 months ago

        Why can’t we analyze AI on its own merits? We don’t base our decisions on whether an idea is more or less polluting than automobiles. We can look at what we are getting for what’s being put into it.

        The big tech companies could scrap their AI tech today and it wouldn’t change most people’s lives.

      • Flying Squid@lemmy.world · 5 months ago

        Apologies, I didn’t post the link, it’s edited now.

        Yes, and it’s paywalled, so I can’t read it. I think you knew that. It could say anything.

        I’m 100% behind forcing data centres to use sustainable water sources or other methods of cooling.

        Cool, good luck with that happening.

        But that is a far cry from AI energy consumption being a major threat,

        A different subject from water. You keep trying to get away from the water issue. I also think you know why you’re doing that.

        Also, define threat. It contributes to climate change. It gets rid of potable water. I’d call that a threat.

        By the way, there is nowhere in the U.S. where water is not going to be a problem soon.

        https://geographical.co.uk/science-environment/us-groundwater-reserves-being-depleted-at-alarming-rate

        But hey, we can just move the servers to the ocean, right? Or maybe outer space! It’s cold!

        • Womble@lemmy.world · 5 months ago

          Ok, you just want to shout, not discuss, so I won’t engage any further.

          • Flying Squid@lemmy.world · 5 months ago (edited)

            That’s a nice cop-out, since nothing I said could remotely be considered shouting, and your New Yorker article in no way supported your point.