Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse gas emissions than commercial flights. In 2018, for instance, the 5bn YouTube views of the viral song Despacito used roughly the amount of energy it would take to heat 40,000 US homes for a year.

Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained GPT-3 at Microsoft’s data facilities.

Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues in drier parts of the world.

Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water use and can lead to pollution, undermining water security. The extraction of these minerals is also often linked to human rights violations and poor labour standards. Trying to achieve one climate goal, limiting our dependence on fossil fuels, can thus compromise another: ensuring everyone has a safe and accessible water supply.

Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.

In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.

  • andrew_bidlaw@sh.itjust.works · 5 months ago

    It isn’t resource efficient, simple as that. Machine learning isn’t something new; it has been used for decades in one form or another. But here is the thing: when you train a model to do one task well, you can estimate the training time and the quality of its data analysis, say, automating how you set the price you charge for your hotel apartments to maximize sales and profits. When you don’t even know what it can do, when you don’t use even a fraction of its potential, when your training material is whatever you dared to scrape and resources aren’t a question, well, you’re dancing and jumping over a fire in the bank’s vault. LLMs of the ChatGPT variety don’t have a purpose or a problem to solve; we come up with those after the fact, and although it’s thrilling to explore what else they can do, it’s a giant waste*. Remember blockchain and how everyone was trying to put it somewhere? LLMs are the same. There are niche uses that will evolve or stay as they are, completely out of the picture, while the hyped-up examples will grow old and die off unless they find their place. And, currently, there’s no application in which I can bet my life on an LLM’s output. Cheers to you if you’ve found where to put one to work, because I haven’t, and I’ve grown irritated at seeing this buzzword everywhere.

    * What I find most annoying about them is that they are natural monopolies, given the resources you need to train them to the Bard/Bing level. If they get inserted into every field within a decade, the LLM providers will have power over everything. Russia’s Kandinsky AI stopped showing Putin and the war in a bad light, for example; OpenAI’s chatbot may soon stop drawing Sam Altman getting pegged by a shy time-traveler Mikuru Asahina; and what if there are other, less obvious cases where the provider of a service simply decides to exclude X from the output, like flags or mentions of Palestine or Israel? If you aren’t big enough to train a model for your own needs, you come under their reign.

    • afraid_of_zombies@lemmy.world · 5 months ago

      That is a good argument: they are natural monopolies because of the resources they need to be competitive.

      Now, do we apply this elsewhere in life? Is anyone calling for Boeing, Microsoft, Amazon, or Facebook to be broken up?