You may as well have typed this in 2009 or 2015.
It used to be that people argued it was worth getting the new console because of "better graphics". The console wars haven't gone anywhere, they've just expanded.
In any case, as for just installing a game and playing it: no, not really. When I was playing games in college in 2012, you'd still open a game and head to the settings menu to adjust things.
Sometimes it was just turning off motion blur, but there were always settings to change to try to reach a stable 60FPS.
Nothing changed, it just expanded. Now instead of 60FPS it's a variable 60-240FPS. Instead of just 720p-1080p, it's 1080p minimum (unless it's a portable) and variable up to 4K beyond that. Instead of "maxing out" we now have raytracing, which pushes software further than our hardware is capable of.
These aren't bad things, they're just now 1) lightly marketed, and 2) better known in the social sphere. There's nothing stopping you from opening up the game and getting right into it, and nothing stopping other people from poring over frame timings and other technical details.
Sure, focusing on little things like that can take away from the wider experience, but people pursue things for different reasons. When I got Cyberpunk 2077 I knew there were issues under the hood, but my experience with the game at launch was also pretty much perfect because I was focused on different things. I personally don't think a dip here and there is worth fretting over, but for some people it ruins the game. Other people just like knowing they're taking full advantage of their hardware, hence digging into how their components are being utilized.
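For anyone wondering what "fretting over frame timings" actually looks like, here's a rough, illustrative sketch (not from any real tool): it just turns FPS targets into per-frame time budgets and flags frames that blow past them.

```python
# Illustrative only: convert FPS targets into per-frame time budgets
# and flag "dips", i.e. frames that took longer than the budget allows.

def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available per frame at a given FPS target."""
    return 1000.0 / target_fps

def find_dips(frame_times_ms, target_fps):
    """Indices of frames that ran over budget (felt as stutter)."""
    budget = frame_budget_ms(target_fps)
    return [i for i, t in enumerate(frame_times_ms) if t > budget]

# 60FPS leaves ~16.7 ms per frame; 240FPS only ~4.2 ms.
print(frame_budget_ms(60), frame_budget_ms(240))

# A made-up capture with one slow frame (index 2) against a 60FPS target:
print(find_dips([16.2, 16.5, 33.1, 16.4], target_fps=60))
```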
There's one last aspect not mentioned: architectures. 10 years ago games would just boot up and run... but what about games from 10 years before then? Most players not on consoles had to pull weird CPU timing shenanigans just to get a game from (now 20) years ago to boot. We're in a similar spot now; emulation is faring better, but X360/PS3-generation games that got PC ports are starting to have issues on modern Windows. Even just 5 or 6 years ago, games like Sleeping Dogs wouldn't play nice on modern PCs, so there's a whole extra layer of tinkering on PC that hasn't even been touched on.
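To give a concrete (and hedged) example of the kind of shenanigans I mean: plenty of late-90s/2000s games misbehave when they see more than one core, and a classic workaround is launching them pinned to a single CPU. A rough sketch using Python's psutil; the game path is made up.

```python
# Rough sketch of one classic old-game workaround: pin the process to a
# single core so its timing code isn't thrown off by a multi-core CPU.
import subprocess

import psutil  # third-party: pip install psutil

GAME_EXE = r"C:\OldGames\SomeGame\game.exe"  # hypothetical path

proc = subprocess.Popen([GAME_EXE])
psutil.Process(proc.pid).cpu_affinity([0])  # restrict to CPU core 0
proc.wait()
```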
All this to say, we're in the same boat we've always been in. The only difference is that these aspects of gaming are more widely known on social media now, so they get more attention.
The one thing I do agree with, though, is that this is all part of software development. Making users need better hardware, intentional or not, is pretty crazy. The fact that consoles themselves now have Quality vs Performance modes is also crazy. But I will never say no to more options. I actually think it's wrong that console versions of games are often missing settings adjustments when the PC counterpart has full control. I understand when it's to keep performance at an acceptable level, but it can be annoying.
Always turn off motion blur and DoF if you can.