• 15 Posts
  • 207 Comments
Joined 2 years ago
Cake day: June 10th, 2023


  • Can’t you accept that someone who knows what they’re talking about might have a different opinion than you? Bluetooth bitrate is, once again, a non-issue in most situations. Unless you’re listening to lossless audio (edit: or the headphones are stuck in headset mode), Bluetooth carries a higher bitrate than the content you’re listening to. And I’d argue that with most headphones you hit the limits of the hardware well before you hit any bitrate limitations anyway. (Edit: what I meant is, if the hardware is capable of delivering better sound quality than the standard codecs can support, the manufacturer will include higher-quality codecs.)

    I didn’t know streaming services lacked audio latency settings; that doesn’t sound ideal. How much latency matters is very situational, varying by person and by content (game streaming is a thing), so I still wouldn’t write Bluetooth off, but if it does bother you, do use wired headphones.
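    The bitrate point above can be made concrete with some quick arithmetic. This is just a sketch; the codec figures are nominal maximums and approximate, not exact limits for any particular device:

    ```python
    def pcm_bitrate_kbps(sample_rate_hz, bit_depth, channels):
        """Bitrate of uncompressed PCM audio, in kilobits per second."""
        return sample_rate_hz * bit_depth * channels / 1000

    # CD-quality lossless audio: 44.1 kHz, 16-bit, stereo -> ~1411 kbps
    cd_kbps = pcm_bitrate_kbps(44_100, 16, 2)

    # Approximate nominal max bitrates for common Bluetooth codecs
    bt_codecs_kbps = {"SBC": 345, "AAC": 256, "aptX": 384, "LDAC": 990}

    # A typical high-quality lossy stream is ~320 kbps, below or near
    # what most of these codecs carry, while true lossless is well above.
    stream_kbps = 320
    print(cd_kbps)
    print({name: rate >= stream_kbps for name, rate in bt_codecs_kbps.items()})
    ```

    In other words, unless the source is actually lossless, the transport usually isn’t the bottleneck.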




  • Either you’ve never used good wireless headphones or you’ve never used bad wired ones. Sure, the best wired headphones might sound better than the best wireless ones, but that’s, once again, not something everyone has lying around. In my personal experience, every set of wired earbuds / headphones I’ve used (stuff my parents had lying around and ones bought for or gifted to me) sounded worse than all but one pair of the wireless ones I’ve used.

    Latency doesn’t matter for audio on its own and can be compensated for with video. The only place it really matters is gaming, and even some games might offer compensation options. Not to dismiss that it might be the deciding factor for some people, but it hardly applies to everyone.
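    The compensation mentioned above is just a timestamp shift: a player that knows the audio output latency delays video presentation by the same amount so the two stay in sync. A minimal sketch (the function name is hypothetical, not any player’s real API):

    ```python
    def compensated_video_pts_ms(video_pts_ms, audio_latency_ms):
        """Shift a video frame's presentation timestamp by the known
        audio output latency, so video lines up with late-arriving audio.

        This works for passive playback but not for games, where the
        delay between input and sound can't be hidden this way.
        """
        return video_pts_ms + audio_latency_ms

    # A 200 ms Bluetooth latency just shifts every frame 200 ms later
    print(compensated_video_pts_ms(0, 200))     # 200
    print(compensated_video_pts_ms(1000, 200))  # 1200
    ```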








  • The latency numbers for displays, i.e. the 8–9 ms or 40 ms figures, include whatever frame buffering the display may or may not do. If the measured latency is less than the frame time, it’s safe to assume the display isn’t buffering whole frames before showing them.

    Your GPU has a frame buffer that’s essentially never less than one frame, and often more.

    And sometimes less, like when vsync is disabled.

    That’s not to say the game is rendered top-left to bottom-right as it’s displayed, but since the render time has to fit within the frame time, one can be certain the render started one frame time before it finished, and the frame is displayed on the next vsync (if vsync is enabled). That’s 22 ms for 45 fps, plus another 16 ms for a worst-case vsync miss and 10 ms of display latency, making 48 ms. Majora’s Mask at 20 fps would have 50 ms of render + 8 ms of display = 58 ms of latency, assuming it too doesn’t miss vsync.
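    The estimate above can be sketched as a small calculation: one frame time of rendering, an optional worst-case missed vsync, and the display’s own latency. This is a rough model matching the comment’s reasoning, not a measurement of any real pipeline (the function is a hypothetical helper):

    ```python
    def pipeline_latency_ms(fps, display_ms, vsync_miss=False, vsync_hz=60):
        """Worst-case render-to-photon estimate: render time (one frame
        time), plus one refresh interval if the frame misses vsync,
        plus the display's own latency."""
        frame_time_ms = 1000 / fps
        miss_penalty_ms = 1000 / vsync_hz if vsync_miss else 0
        return frame_time_ms + miss_penalty_ms + display_ms

    # 45 fps render (~22 ms) + worst-case vsync miss (~17 ms) + 10 ms display
    print(pipeline_latency_ms(45, 10, vsync_miss=True))  # just under 49 ms

    # Majora's Mask at 20 fps (50 ms) + 8 ms display
    print(pipeline_latency_ms(20, 8))  # 58.0
    ```

    The comment’s 48 ms figure comes from rounding each term down; the unrounded sum is a shade under 49 ms.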



  • First time hearing that about OLEDs, can you elaborate? Is it that the lack of inherent motion blur makes it look choppy? As far as I can tell that’s a selling point that even some non-OLED displays emulate with backlight strobing, not something displays try to get rid of.

    Also, the inherent LCD latency thing is a myth: modern gaming monitors add little to no latency even at 60 Hz, and at high refresh rates they’re faster than 60 Hz CRTs.

    Edit: to be clear, this refers to the screen’s refresh rate; the game doesn’t need to run at a high frame rate to benefit.





  • Lojcs@lemm.ee to Games@lemmy.world · The Witcher 4 | Gameplay Tech Demo
    Surely Digital Foundry will find improvements, but to me this doesn’t look that different from The Witcher 3. Better animations, and the hair looks better, I think? I never noticed LOD transitions myself. The promise of increased interactivity means nothing in a trailer either. I only hope it has better multi-core performance, especially with ray tracing.