jfrnz@lemm.ee to Lemmy Shitpost@lemmy.world · AI Training Slop · +4/−3 · 7 days ago

Running a 500W GPU 24/7 for a full year uses less than a quarter of the energy consumed by the average automobile in the US (as of 2000). I don’t know how many GPUs this person has or how long the fine-tuning took, but it’s clearly not creating an ecological disaster. Please understand there is a huge difference between the power consumed by companies training cutting-edge models at massive scale and speed, and a locally deployed model doing only fine-tuning and inference.
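For anyone who wants to check the arithmetic, here is a quick sketch. The GPU side follows directly from 500 W running 24/7; the car-side figures (annual mileage, fuel economy, energy content of gasoline) are my own rough assumptions, not numbers from the comment:

```python
# Back-of-envelope check of the GPU-vs-car energy comparison.
GPU_WATTS = 500
HOURS_PER_YEAR = 24 * 365

gpu_kwh = GPU_WATTS * HOURS_PER_YEAR / 1000  # 4380 kWh per year

# Assumed figures for an average US car (ballpark, not from the comment):
miles_per_year = 11_500
mpg = 22
kwh_per_gallon = 33.7  # approximate energy content of gasoline

car_kwh = miles_per_year / mpg * kwh_per_gallon  # roughly 17,600 kWh per year

print(f"GPU: {gpu_kwh:.0f} kWh/yr, car: {car_kwh:.0f} kWh/yr, "
      f"ratio: {gpu_kwh / car_kwh:.2f}")
```

With these assumptions the ratio comes out just under 0.25, which is consistent with the "less than a quarter" claim; different mileage or fuel-economy assumptions would shift it somewhat.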