People love to hate AI, but I think it’s one of the most human inventions ever. The majority of my internal human experience already runs on autopilot.

The life-critical tasks have been outsourced to various biological systems.

My heart beats 24/7 without conscious effort, thanks to the cardiovascular system. Digestion? Handled seamlessly by another system. Breathing? Autopilot. I don’t have to remind myself to inhale and exhale. It just happens.

Even many of my own thoughts seem to appear out of nowhere—emerging from my subconscious or triggered by something around me.

Is it any wonder that, in one way or another, all human technologies strive to replicate this internal ‘automation’ in the external world?

To me, it’s a beautiful—if ultimately futile—attempt to harmonize our inner and outer realities.

      • aCosmicWave@lemm.ee (OP) · 24 hours ago

        Sorry and fixed! FYI my light mode is on during the day and off at night. I’m not a total monster!

  • 5714@lemmy.dbzer0.com · 1 day ago

    Yes, I also love infrastructure, but current AI is not remotely reliable enough to be called automation of thinking.

    • aCosmicWave@lemm.ee (OP) · 1 day ago (edited)

      You’re absolutely right, I definitely don’t think that we are there!

      Although I do believe that humanity has always trended this way—starting with sitting on rocks, then shaping trees to fit the contours of our physical bodies as chairs. Now, we’re trying to shape abstract knowledge and “thoughts” to fit the contours of our individual minds for similar reasons.

      • 5714@lemmy.dbzer0.com · 24 hours ago

        Nah.

        A proper AGI outcompetes humanity big time.

        Do you see any other hominids around? I am but a layperson, but AFAIK, a matter-manipulating AGI would shrink our niche.

        I don’t think humanity is able to halt this evolution.

        This doesn’t have to be a bad thing. Maybe we’ll get reservations.

  • morrowind@lemmy.ml · 1 day ago

    No, that would be a regular computer.

    Your heart is not intelligent; in fact, the closest analogue would be a… pacemaker.

    The closest analogue to AI is your brain.

  • EfreetSK@lemmy.world · 1 day ago (edited)

    In my opinion yes, unfortunately. It’ll go through the Gartner hype cycle soon, but it’ll recover and slowly get better every year to the point where it’s really good.

    The worst thing is that I don’t see any “stop sign”. With self-driving cars, for example, it was kind of obvious that they’d get ridiculously complex in real-life situations and thus run into problems with legislation and mass adoption. But with AI? I don’t know, I don’t see any stop sign… Maybe it never reaches the high mark we all expect?

    • wizardbeard@lemmy.dbzer0.com · 48 minutes ago

      But with AI? I don’t know, I don’t see any stop sign… Maybe it never reaches the high mark we all expect?

      I personally think that’s the most likely outcome. Most of the advances lately rely on effectively “brute forcing” the problem space by shoving more training data in and by using more resources to calculate weights. There are minor improvements here and there by combining approaches, but development of new techniques has largely slowed to a crawl.

      There’s also still no clear path for any of this tech to make the massive leap from “trained for a purpose” to generalized knowledge, which is the most commonly cited “selling point” for the whole idea.

      And all of that is ignoring the fact that OpenAI, the biggest name in the space, operates at a considerable loss. They only still exist because Microsoft can afford to burn the equivalent of a small country’s GDP on the small chance they get to be an industry leader on this. The resource, money, and energy investment for the current results is so absurdly mismatched that unless something huge manages to shake things up, I have a very hard time seeing it ever reach the heights the hype machine has been prophesying.

      Machine Learning is amazing, has been improving all sorts of things for multiple decades, and will continue to do so long after this current overhyped idea of AI fades away. The current glorified chat bots, generative AI stuff? I think we’re already well past the point of reasonable ROI in terms of resources.