• TommySoda@lemmy.world
    7 months ago

    I spent over $4,000 on my PC a few years ago. She isn’t top of the line, but she gets the job done. I shouldn’t have to brute force my way through newer games because the optimization is so bad that my fairly decent PC can’t even hit 60 fps, when it could run God of War at 100+ fps two years ago. I’m not gonna spend thousands of dollars on a new graphics card every couple of years just because your game runs like shit. That’s almost as much money as I spend on my car payments.

    Not optimizing games is just lazy game development. Back in the ’90s and 2000s you HAD to make your game run well on most systems or you wouldn’t be able to sell as many copies as you needed to. Nowadays computers and consoles can do so much that optimization is unfortunately an afterthought.

    • ninjan@lemmy.mildgrim.com
      7 months ago
      7 months ago

      I’d say the problem isn’t so much optimization as it is scaling. The FPS delta between low and ultra is just stupidly small in many games nowadays. Before, dropping to low would make the game look like shit, sure, but it would also run on 5+ year old hardware. Now you gain maybe 10 FPS and still slog around under 60 fps on 2-3 year old 60-class cards (X060/X600). Sure, some games are CPU bound as well, but that’s less common.

      Really what needs to happen is devs need to add a potato mode so we can at least play the game.

      I’ll however say that the source of the problem is, of course, consoles. On them, settings are rather meaningless, so it’s only for the PC market that you need them. And given how many gaming PCs outperform consoles, and that PC gamers generally expect the PC version to look better, it’s no wonder that’s where devs put their focus and effort. But a proper low setting that actually scales shouldn’t be too hard to achieve.