I feel like we hear this every single time though. "Largest tech leap in a hardware generation" very much means "we'll bump the graphics a little, we're still targeting 30fps though"
I'd argue this generation actually did deliver performance-wise: most games release with a performance mode that targets 60fps, whereas the PS3/PS4 generations felt mostly stuck at 30fps.
Honestly, that's fair. Maybe I was being a little too harsh, plus this gen did come with more customizable settings (i.e., choosing between a "performance" mode and a "fidelity" mode).
And often, the fidelity mode is close to or sometimes even native 4K, which is impressive for a console. Remember when full 1080p was the push?
Well, yeah, this gen is pretty much last gen but 60fps.
Except when it tries to do fancy UE5 features or ray tracing; then you get that 30fps with smeary FSR.
Up until recently, I think most TVs weren't 60Hz.
TVs have been 60Hz as standard for a long long long time. We're talking multiple decades.
Even in the CRT days, almost all TVs (in North America, that is) were effectively 60Hz, since NTSC ran at a 59.94Hz interlaced field rate.
Whether a game runs at 30 or 60fps is always a developer choice, not a hardware limitation.
That being said, every generation console makers build the most powerful hardware they can for the price point they're going to charge. It's not like Microsoft has any secret sauce here; it's the same AMD/Nvidia hardware anyone could put in a machine at that price point.
I don't know why you were being downvoted. It's true. FPS is the developer's decision. If a game had like 9 pixels on screen, they could make it run at an ultra-high framerate.
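To put rough numbers on that: a 30fps target is a ~33.3ms per-frame budget and a 60fps target is ~16.7ms; the hardware only determines how much work fits inside that window, while the target itself is something the developer picks. A minimal sketch of the idea, with purely hypothetical code rather than anything from a real engine:

```cpp
// Hypothetical sketch: the framerate target is just a time budget the
// developer chooses; the hardware only decides how much work fits in it.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double target_fps = 60.0;  // developer-chosen target, not dictated by hardware
    const std::chrono::duration<double> frame_budget(1.0 / target_fps);

    for (int frame = 0; frame < 10; ++frame) {
        const auto start = clock::now();

        // ... simulate and render here; "graphics vs framerate" is really
        // "how much of this ~16.7 ms budget does the work consume" ...

        const auto elapsed = clock::now() - start;
        if (elapsed < frame_budget) {
            std::this_thread::sleep_for(frame_budget - elapsed);  // made the cap, wait out the rest
        } else {
            std::printf("frame %d blew its budget\n", frame);     // shows up as a dropped frame
        }
    }
    return 0;
}
```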
Developers usually prefer better graphics over framerate, however. I just hope more games allow a choice between graphics, framerate, and a balance of the two, like Hogwarts Legacy does.
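And for what a graphics/balanced/performance selector usually boils down to, here's a rough sketch of a preset table; the names and numbers are illustrative, not Hogwarts Legacy's actual settings:

```cpp
// Hypothetical preset table; real games pick their own trade-offs.
#include <cstdio>

struct QualityPreset {
    const char* name;
    int target_fps;          // framerate the mode aims for
    float resolution_scale;  // fraction of native output resolution before upscaling
    bool ray_tracing;
};

int main() {
    const QualityPreset presets[] = {
        {"Fidelity",    30, 1.00f, true},   // prioritize image quality
        {"Balanced",    40, 0.80f, true},   // middle ground; 40fps paces evenly on a 120Hz display
        {"Performance", 60, 0.66f, false},  // prioritize framerate
    };

    for (const QualityPreset& p : presets) {
        std::printf("%-12s target %2d fps, %3.0f%% render scale, ray tracing %s\n",
                    p.name, p.target_fps, p.resolution_scale * 100.0f,
                    p.ray_tracing ? "on" : "off");
    }
    return 0;
}
```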
It’s funny that you enjoy these settings.
Personally, I hate these, as I just want to play the game the way the developers wanted me to play it. I hate that PC influence.
I guess everyone is different 😅
So you hate that PCs are more capable and can display better graphics at higher framerates, and you've rationalized to yourself that worse graphics and framerates on a console are "how the developers intended" it.
I can understand not wanting to tinker with settings and just loading a game up knowing what to expect in terms of graphics and framerate, but I just cannot disagree more with what you are saying here. Building games to console limitations without even giving the option to choose between fidelity and framerate just seems like a step backward.
You have a point.
Personally, I was thinking mostly about these options for games developed primarily for consoles.
It's true that I hadn't taken into account that some games are developed for PC and downgraded for consoles.
Still, even for such a game I would want the developers to settle on what they consider the right balance between fidelity and performance.
Yes, whether to code and optimize the game properly is always the creator's choice.