• simple@lemm.ee · 1 year ago

    We’ve been warned, I expected performance to be rough but ~35fps on a 4090 is a new low for me.

      • 30p87@feddit.de · 1 year ago

        And then there’s everything not triple A, which is 99% terrible but 1% gold.

    • Rhaedas@kbin.social · 1 year ago (edited)

      I’ve played some action games in the teens and was fine with it. Maybe a lower frame rate at low resolution (1080p) isn’t as apparent as it is at 4K, but I’ve never understood why people can’t play at frame rates still far faster than film (if it’s truly refreshing the frames completely and not tearing the picture, of course). I suppose this argument goes the same direction as the vinyl/CD one, with both sides dead sure they’re right.

      If the game is handling variations of frame rates during play badly, that’s a different story. The goal is for the player to not realize there’s a change and stay focused on the game.

      • Klear@sh.itjust.works · 1 year ago

        I started out playing Doom on a 386, in a tiny tiny viewport, and until recently my hardware was always behind the curve. I remember playing Oblivion at 640 x 380, and enjoying foggy weather in San Andreas because the reduced draw distance made my fps a lot better.

        Over the years I’ve trained my brain to do amazing real-time upscaling, anti-aliasing, hell, even frame generation. nVidia has nothing on the neural network in my head.

        But not everyone has this experience, and smooth FPS is always better, even if I can handle terrible performance when the game is any good.

      • 9bananas@lemmy.world · 1 year ago

        simple explanation: people get used to their monitors’ frame rate.

        if all you’ve been using is a 60Hz display, you won’t notice a difference down to 30-40 fps as much as you would when you’ve been using a 144Hz display.

        our brains notice differences much more easily than absolutes, so a larger difference in refresh rate produces a more negative experience.

        think about it like this:

        The refresh rate influences your cursor movements.

        so if a game runs slower than you’re used to, you’ll miss more of your clicks, and you’ll need to compensate by slowing down your movements until you get used to the new refresh rate.

        this effect becomes very obvious at very low fps (<20 fps). it’s when people start making super slow movements.

        same thing happens when you go from 144Hz down to, say, 40Hz.

        that’s an immediately noticeable difference!
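
        The "differences, not absolutes" point above comes down to per-frame time. A minimal arithmetic sketch (my own illustration, not from the thread):

```python
# Frame time in milliseconds for a given refresh rate / fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Dropping from 60 Hz to 40 fps adds ~8 ms per frame...
drop_from_60 = frame_time_ms(40) - frame_time_ms(60)    # 25.0 - 16.7 ≈ 8.3 ms
# ...while dropping from 144 Hz to 40 fps adds ~18 ms per frame,
# a much larger jump, which is why it feels immediately worse.
drop_from_144 = frame_time_ms(40) - frame_time_ms(144)  # 25.0 - 6.9 ≈ 18.1 ms

print(f"{drop_from_60:.1f} ms vs {drop_from_144:.1f} ms")
```

        The absolute frame rate is the same in both cases; what differs is how far it sits from the frame pacing your hands and eyes have calibrated to.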

    • MudMan@kbin.social · 1 year ago

      Some of the settings are messed up, I think. It can definitely run faster than that on that hardware by toning down some settings. They really should have changed the defaults or straight up removed some visual settings, given what they do to the game. In my experience, the volumetric clouds, reflections, and GI presets are all messed up and cost a disproportionate amount of performance when maxed out.