• 1 Post
  • 39 Comments
Joined 2 years ago
Cake day: June 13th, 2023

  • When I was in my 20s, I had my alarm clock on the other side of the room and still managed to oversleep. I ended up keeping it under my bed, close to the wall, so I had to crawl into the narrow space under the bed, pressing my body against the cold floor, to turn off the alarm… and I never managed to turn that off in my sleep. But I would have preferred a wake-up call… so, even though I have never used the service when staying at a hotel, I can see why some people use it.




  • Well, that was something… I have used ligatures in my code editor for quite a few years now, and I have NEVER been confused by the ambiguity this person is so upset about. Why? I have never ever seen the Unicode character for not-equals in a code block, simply because it is not a valid character in any mainstream language. In fact, I have never even seen it in a string, where it actually would be legal, probably because nobody knows how to type it on a standard keyboard (see the small sketch below). This whole article felt like someone with a severe diagnosis has locked onto some hypothetical correctness issue that simply isn’t a problem in the real world.

    But, if you for some reason find ligatures confusing, then you shouldn’t use them. But, just to be clear, this is not a matter of right or wrong, as the blog post tries to argue; it is a matter of personal taste.
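    To make that distinction concrete, here is a minimal Rust sketch (the values are made up for illustration): the `!=` in the comparison is two ASCII characters that a ligature font merely renders as ≠, while the actual U+2260 character only appears inside a string literal, where it is legal.

    ```rust
    fn main() {
        let a = 1;
        let b = 2;

        // Two ASCII characters, '!' and '='. A ligature font may draw this as ≠,
        // but the source file still contains plain "!=".
        if a != b {
            println!("a and b differ");
        }

        // The real U+2260 character is only legal inside a string literal here;
        // using it as an operator would not compile in Rust.
        let symbol = "≠";
        println!("the not-equals sign looks like {symbol}");
    }
    ```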




  • The problem is that C is a prehistoric language and doesn’t have any of the complex types, for example. So, in a modern language you create a String. That string has a length and some well-defined properties (like its encoding). With C you have a char *, which is just a pointer to memory that contains bytes and hopefully is null terminated. The null termination is defined, but not enforced. Any encoding is whatever the developer had in mind. So the compiler just doesn’t have the information to make any decisions. In Rust you know exactly how long something lives; if something tries to use it after that, the compiler can tell you. With C, all lifetimes live in the developer’s head, and the compiler has no way of knowing. So all this typing and these properties of modern languages are basically the implementation of your suggestion.
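    A minimal Rust sketch of those two points (just an illustration, not something from the comment above): the String type carries its length and encoding, and the compiler tracks ownership, so using the data after its lifetime ends never gets past compilation.

    ```rust
    fn main() {
        // A Rust String knows its own byte length and is guaranteed to be valid
        // UTF-8; nothing depends on a trailing NUL byte or on conventions kept
        // in the developer's head, as it would with a C `char *`.
        let greeting = String::from("héllo");
        println!("bytes: {}, chars: {}", greeting.len(), greeting.chars().count());

        // The compiler also knows exactly how long the data lives.
        let owned = String::from("temporary");
        drop(owned); // ownership ends here
        // println!("{owned}"); // uncommenting this fails to compile:
        //                      // error[E0382]: borrow of moved value: `owned`
        // The equivalent C pattern, a `char *` used after free(), would compile
        // and silently read freed memory.
    }
    ```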






  • I actually asked ChatGPT about a specific issue I had hit and solved a while back. It was one of those issues where it looks like a simple, naive solution would be sufficient, but due to various conditions where that fails, you have to go with a more complex solution. So, I asked about it to see what it would answer. It went with the simpler solution, but with some adjustments. The code also didn’t compile. But it looked interesting enough for me to question myself. Maybe it was just me who had failed with the simpler solution, so I actually tried to fix the compile errors to see if I could get it working. But the more I tried to fix its code, the more obvious it got that it didn’t have a clue about what it was doing. However, due to its confidence and ability to make things look plausible, it sent me on a wild goose chase. And this is why I am not using LLMs for programming. They are basically overconfident junior devs who like mansplaining.



  • I know I have used it since Fedora made it the default in 2016. I think I actually used it a while before that, but I don’t have anything to help me pin down the exact time.

    Since I only use the Intel built-in GPU, everything has worked pretty well. The few times I needed to share my screen, I had to log out and log back in to an X session. However, that was solved a couple of years ago. Now, I am just waiting for Java to get proper Wayland support, so I can fully ditch X for my daily use and take advantage of the multi-DPI capabilities of Wayland.



  • But is the desktop really the most relevant measurement? Wouldn’t it be more relevant to talk about “primary” devices? When I was growing up, the desktop was what people used to connect to the Internet and everything that comes with that. Hence, Linux on the desktop seemed relevant. Now, that is still relevant in relation to work and gaming, but for general use people use other devices. So instead of “on the desktop”, I think we should talk about “for work”, “for gaming” and “for programming”.






  • From their documentation:

    Unlike classic terminals, Warp requires you to sign up and log in to get started with the app.

    So, yeah, it might be that people are not very impressed by a terminal that requires a cloud account.

    But, if you don’t type anything sensitive into your terminal, like passwords and such, then you should be fine…