HighlyRegardedArtist@lemmy.world to Lemmy Shitpost@lemmy.world • How to clean a rescued pigeon
4 · 1 year ago
You can play with words all you like, but that’s not going to change the fact that LLMs fail at reasoning. See this Wired article, for example.
32 · 1 year ago
I have to disagree with that. To quote the comment I replied to:
AI figured the “rescued” part was either a mistake or that the person wanted to eat a bird they rescued
Where’s the “turn of phrase” in this, lol? It could hardly read any more clearly that they assume this “AI” can “figure” stuff out, which is simply false for LLMs. I’m not trying to attack anyone here, but spreading misinformation is not ok.
207 · 1 year ago
Or, hear me out, there was NO figuring of any kind, just some magic LLM autocomplete bullshit. How hard is this to understand?
This is one of the reasons I’ve disabled UEFI variable access by default with the noefi kernel parameter, the other reason being the LogoFAIL exploit: https://wiki.archlinux.org/title/Unified_Extensible_Firmware_Interface#Disable_UEFI_variable_access
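For anyone wanting to try this, a minimal sketch of how the noefi parameter might be added on a GRUB-based system. Assumptions not in the comment above: that GRUB is the bootloader and that its config lives at /etc/default/grub; the /tmp/grub.demo file is purely illustrative so nothing on the live system is touched.

```shell
# Illustrative copy of a GRUB_CMDLINE_LINUX_DEFAULT line (hypothetical contents).
printf 'GRUB_CMDLINE_LINUX_DEFAULT="quiet loglevel=3"\n' > /tmp/grub.demo

# Append noefi inside the quoted kernel parameter list.
sed -i 's/^\(GRUB_CMDLINE_LINUX_DEFAULT="[^"]*\)"/\1 noefi"/' /tmp/grub.demo

cat /tmp/grub.demo   # GRUB_CMDLINE_LINUX_DEFAULT="quiet loglevel=3 noefi"

# On a real system you would edit /etc/default/grub itself, then regenerate:
#   grub-mkconfig -o /boot/grub/grub.cfg
```

Other bootloaders (systemd-boot, rEFInd) take kernel parameters in their own entry files, so the sed step above is GRUB-specific.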


Ok, great to know. Nuance doesn’t travel well over the internet, so your intention wasn’t clear, given all the uninformed hype & grifters around AI. Being somewhat blunt helps get the intended point across. ;)