this post was submitted on 24 Sep 2023
968 points (97.5% liked)

[–] [email protected] 1 points 1 year ago

It depends how they store the models. If they're using normal mapping (which they probably are), they'll need to store the following per vertex: position (x,y,z), normal (x,y,z), texture coordinates (u,v), tangent (x,y,z) and bitangent (x,y,z). Assuming a custom binary format with 32-bit (4 byte) floats, that's 56 bytes per vertex. The Sponza model, which is commonly used for testing, has around 1.9 million vertices: in our hypothetical format, that's at least 106.4 MB for the vertices alone. But we also have to store the indices, which are an optimisation to avoid repeating shared vertices. Sponza has 3.9 million triangles, and at 3 32-bit integers per triangle that's an additional 46.8 MB. So even with this naive format, which should be extremely fast to load, 3D model data is no insignificant contributor to file size for a lot of models.
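
To make the arithmetic concrete, here's a minimal sketch in C of what that hypothetical packed vertex layout and the back-of-the-envelope size estimate could look like. The struct fields and the vertex/triangle counts are just the numbers from the comment above, not any particular engine's actual on-disk format:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical packed vertex layout matching the format described above:
   14 floats x 4 bytes = 56 bytes per vertex (all-float struct, so no padding). */
typedef struct {
    float position[3];   /* x, y, z */
    float normal[3];     /* x, y, z */
    float texcoord[2];   /* u, v */
    float tangent[3];    /* x, y, z */
    float bitangent[3];  /* x, y, z */
} Vertex;

int main(void) {
    /* Rough Sponza figures quoted above. */
    const uint64_t vertex_count   = 1900000;  /* ~1.9 million vertices  */
    const uint64_t triangle_count = 3900000;  /* ~3.9 million triangles */

    uint64_t vertex_bytes = vertex_count * sizeof(Vertex);           /* ~106.4 MB */
    uint64_t index_bytes  = triangle_count * 3 * sizeof(uint32_t);   /* ~46.8 MB  */

    printf("per-vertex size: %zu bytes\n", sizeof(Vertex));
    printf("vertex data:     %.1f MB\n", vertex_bytes / 1e6);
    printf("index data:      %.1f MB\n", index_bytes / 1e6);
    printf("total:           %.1f MB\n", (vertex_bytes + index_bytes) / 1e6);
    return 0;
}
```

Running it just reproduces the figures in the comment: 56 bytes per vertex, roughly 106.4 MB of vertex data plus 46.8 MB of index data, about 153 MB total before any compression.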