[–] [email protected] 11 points 11 months ago (3 children)

Anyone have a good explanation of "frame time"? This is the first time I've heard the term, and after some quick googling I feel like I'm not understanding why it's worth caring about.

[–] [email protected] 19 points 11 months ago

It's how long it takes the system to render the next frame. High frame times are bad: they mean a lower average fps and a worse player experience. You also want stable frame times, which translates to smooth gameplay and less "stuttering". Anything under 20 ms is considered good, 10 ms or less is great, and anything over 50 ms will be perceived negatively by the player.
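
As a quick sanity check on those numbers, here's a minimal Python sketch (my own, not from any benchmarking tool) converting the thresholds to fps, since fps is just 1000 ms divided by the frame time:

```python
def fps_from_frame_time(frame_time_ms: float) -> float:
    """There are 1000 ms in a second, so fps = 1000 / (ms per frame)."""
    return 1000.0 / frame_time_ms

# The thresholds from above, expressed as fps:
for ms in (10, 20, 50):
    print(f"{ms:>2} ms/frame -> {fps_from_frame_time(ms):.0f} fps")
# 10 ms -> 100 fps (great), 20 ms -> 50 fps (good), 50 ms -> 20 fps (bad)
```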

[–] [email protected] 4 points 11 months ago

I interpret it as the time taken to render a single frame. Unlike FPS, which is basically a moving average (or rather 1 divided by the average frame time), a frame time is a single data point. Collecting frame times lets you do things like compute the median or, in this case, look at the slowest 1% of frames (note those are the *highest* frame times, which is where "1% low" fps figures come from). That gives you a better idea of how smooth performance appears to the player, and what the worst-case performance is like.
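
To make that concrete, here's a small Python sketch with made-up frame times showing how the median can look fine while the tail reveals the stutter (the 99th-percentile frame time is the flip side of the "1% low" fps figure):

```python
import statistics

# Hypothetical capture: mostly ~10 ms frames plus the occasional 45 ms stutter.
frame_times_ms = [10.0] * 99 + [45.0]

median_ms = statistics.median(frame_times_ms)
p99_ms = statistics.quantiles(frame_times_ms, n=100)[98]  # 99th percentile

print(f"median:  {median_ms:.1f} ms (~{1000 / median_ms:.0f} fps)")
print(f"99th %:  {p99_ms:.1f} ms (~{1000 / p99_ms:.0f} fps)")
# The median (~10 ms, ~100 fps) looks great; the 99th-percentile frame time
# (~45 ms, ~22 fps) is the worst case that the average hides.
```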

[–] [email protected] 4 points 11 months ago

I'm not surprised at the confusion, because they're using it... not wrongly, but very confusingly.

Frame time is literally the time it takes to render a frame. So you'd expect it to be a number of milliseconds per frame, and for lower to be better.

But they're not reporting raw frame times; they're looking at 1% lows and expressing them in fps, not in frame times. So yeah, confusing.

For the record, the reason the term is becoming popular is that there are now widespread visualizations that plot your frame times as a line on a graph, so you can see whether the line is flat or spiky. You've probably seen it on the Steam Deck or in performance analysis videos. The idea is that consistent frame times are better than a high average fps with poor 1% or 0.1% lows. So a stable 60 fps can look better than a spiky 90 fps, and so on.
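
To illustrate with made-up numbers (and using one common convention for "1% lows", the average fps over the slowest 1% of frames; tools differ slightly in how they define it), here's a quick Python sketch:

```python
import statistics

def avg_fps(frame_times_ms):
    """Average fps is 1000 divided by the mean frame time in ms."""
    return 1000.0 / statistics.mean(frame_times_ms)

def low_1pct_fps(frame_times_ms):
    """Average fps over the slowest 1% of frames (the highest frame times)."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000.0 / statistics.mean(worst[:n])

# Two hypothetical 1000-frame traces:
stable_60 = [16.7] * 1000                # a flat line on a frametime graph
spiky_90 = [10.0] * 990 + [120.0] * 10   # high fps, but with visible hitches

for name, trace in (("stable 60", stable_60), ("spiky 90", spiky_90)):
    print(f"{name}: avg {avg_fps(trace):.0f} fps, 1% low {low_1pct_fps(trace):.0f} fps")
# stable 60: avg 60 fps, 1% low 60 fps
# spiky 90:  avg 90 fps, 1% low 8 fps  <- the spikes are what you notice
```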