A report from Morgan Stanley suggests the datacenter industry is on track to emit 2.5 billion tons of CO2 by 2030, roughly three times what had been predicted before generative AI came into play.

The extra demand from GenAI will reportedly drive annual emissions from 200 million tons this year to 600 million tons by 2030, largely because of the construction of more data centers to keep up with demand for cloud services.
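For scale, here is a rough back-of-the-envelope sketch (not from the report itself; it assumes “this year” means 2024 and uses only the 200 and 600 million ton figures quoted above) of what that trajectory implies as an annual growth rate:

```python
# Back-of-the-envelope check of the article's figures.
# Assumption: "this year" = 2024, with annual emissions rising from
# 200 Mt of CO2 to 600 Mt of CO2 by 2030 (figures quoted above).
start_mt, end_mt, years = 200, 600, 2030 - 2024

growth_factor = end_mt / start_mt                # 3.0x rise in annual emissions
implied_cagr = growth_factor ** (1 / years) - 1  # ~20% compound annual growth

print(f"Annual emissions grow {growth_factor:.1f}x,"
      f" implying about {implied_cagr:.0%} growth per year.")
```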

  • MyOpinion@lemm.ee · 2 months ago

    Looks like AI will eliminate any gains we’ve made on climate change. Too bad for us.

      • Rhaedas@fedia.io · 2 months ago

        Over time, it is. It’s eliminating the source. In Terminator, The Matrix, and the rest, the AI takes a split second to act, but our AI doesn’t have those connections. It’s working with what it’s got.

        • theshatterstone54@feddit.uk · 2 months ago

          It’s scary how our corporate overlords watch all the dystopian films and are like: “Let’s turn that into a reality!”

    • akwd169@sh.itjust.works · 2 months ago

      Too bad for us, yeah.

      The people benefiting from this have more than enough money to stay comfortable and live a long life if the environment becomes hazardous, deadly and inimical to human life.

      It’s the masses who will suffer, who will be forced to live in tiny bunkers just to survive and work for the capitalists, never going outside until an extreme weather event wipes them away despite their budget bunkers.

      And the capitalist ruling class will say, “Oh look, more space to expand my summer bunker.”

    • ms.lane@lemmy.world · 2 months ago (edited)

      That already happened with Crypto.

      AI will use less power over time as hardware gets faster and we approach a ‘good enough’ level of computing power, similar to desktops/laptops: outside of gaming, power draw for the average desktop has only decreased since Sandy Bridge, and a 2600K is still good enough for the average desktop; it can even still hold its own in gaming.

      Crypto, by design, will never decrease in power use; it will only ever increase.