this post was submitted on 18 Jul 2023
34 points (100.0% liked)
Furry Technologists
1307 readers
Science, Technology, and pawbs
founded 1 year ago
I haven't found a good encoder. I've tried all the ones available through ffmpeg, such as libsvtav1 and librav1e, but the results aren't as good. I screenshot the same frame from the original and the transcoded file, then compare that with h265, and the av1 version has noticeable artifacts - like an object will move across the sky and sometimes leave a few pixels behind. I just assume the problem is with the encoder, and if I can get hold of a GPU that does encoding, then I can finally start using it. Until then I'm using vp9 for the short clips I publish, since that works on Discord and Telegram. Now if only Mastodon would support literally anything except h264/aac, then I wouldn't have to add an h264 version to the pipeline for everything.
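For anyone wanting to reproduce that kind of comparison, pulling the same frame out of each version is a one-liner per file; filenames and the timestamp here are just placeholders:

```shell
# Grab the frame at 1:23 from each version for a side-by-side look.
# -ss before -i seeks quickly; -frames:v 1 writes a single PNG.
ffmpeg -ss 00:01:23 -i original.mkv    -frames:v 1 frame_src.png
ffmpeg -ss 00:01:23 -i av1_encode.mkv  -frames:v 1 frame_av1.png
ffmpeg -ss 00:01:23 -i h265_encode.mkv -frames:v 1 frame_h265.png
```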
Hardware encoding is worse quality than software encoding iirc (per filesize). libaom should be the best-quality encoder, but it's also the slowest, I believe? Idk, the last time I checked on AV1 was like a couple of years ago and the encoders are always in flux. You can use VMAF for a deterministic quality score if you don't want to pixel hunt.
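If your ffmpeg build includes libvmaf, scoring a transcode against its source looks something like this - note the distorted file goes first, and both inputs need matching resolution and framerate (filenames are placeholders):

```shell
# Compare the AV1 encode against the original; ffmpeg prints a
# VMAF mean score (0-100, higher = closer to the source).
ffmpeg -i av1_encode.mkv -i original.mkv \
  -lavfi "[0:v][1:v]libvmaf" -f null -
```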
What settings have you been using? I haven't noticed any issues so long as I'm not concerned with real-time encoding. And yeah, GPU encoding is generally worse than software, it's just usually way faster.
EDIT: for reference, I've been using speed 6 and an RF between 40 and 30, and even in fast-paced scenes like the ones in "Puss in Boots: The Last Wish" I can't notice anything super off. With my real-time recordings the best I can do is speed 7 at 6000 kbps (maybe a higher bitrate), which isn't quite enough for the fast, colorful 1080p60 gameplay of Splatoon 3 - but even then, I'd need a decently higher bitrate with either x264 or x265, especially GPU-encoded.
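Assuming those are SVT-AV1 settings, the equivalent ffmpeg invocation would look roughly like this - the filenames and the audio codec choice are placeholders, not anything the commenter specified:

```shell
# SVT-AV1 at preset 6 (the "speed" setting) and CRF 30,
# with Opus audio alongside.
ffmpeg -i input.mkv \
  -c:v libsvtav1 -preset 6 -crf 30 \
  -c:a libopus -b:a 128k \
  output.mkv
```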
IDK his exact settings, but a fur I follow on Mastodon has been able to get 4K Blu-ray rips encoded to AV1 down to a handful of gigabytes, and he reports no noticeable quality problems at speed 6 RF 30.
The settings depend on what I'm doing. If I'm trying to squeeze a lot of video into a small space, I'll crank up the crf; for good quality I was using crf 40 or so. If something is already encoded, simply transcoding will be lossy and blurrier if you put them side by side, so for archiving I decided to just always keep the source, even if it's raw DVD or another wasteful codec. But for making clips and things out of it, I try to use the best thing that doesn't waste a lot of space and is playable by most people.

It was a while ago that I compared encoders, but I settled on libsvtav1. librav1e seemed to have even fewer options and didn't seem better, but maybe there's a way to tune it better. What killed it for me was when I noticed an artifact in an AV1 video, and when I encoded the same video with h265 at the same bitrate, it didn't have that problem. Eventually one of my subscribers complained they couldn't view my h265 videos on their phone, so I switched to vp9 for more compatibility, without having to stoop back to the least-common-denominator codec that is h264. So now even if I figure out the best encoder library and settings, I can't use it because of compatibility - I'm not just making videos for me to look at. I'll probably try pushing for AV1 again soon just to see, because it's inevitable that it will be supported everywhere.

I'm sad to hear people saying GPU encoding can't be better; I'd hoped there would be an option to make it take longer but do a good job at really great quality per byte. I still don't have a GPU that can do AV1 encoding, and I can't find any articles that compare the quality of the output - it's always just about how fast it encodes.
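For the vp9 clips mentioned above, the usual compatibility-friendly recipe with libvpx-vp9 is a two-pass constant-quality encode; the filenames and CRF value here are illustrative, not the commenter's actual settings:

```shell
# Two-pass VP9 in constant-quality mode (-b:v 0 plus -crf).
# Pass 1 only gathers stats, so audio is dropped and output discarded.
ffmpeg -y -i clip.mkv -c:v libvpx-vp9 -b:v 0 -crf 32 -pass 1 -an -f null /dev/null
ffmpeg    -i clip.mkv -c:v libvpx-vp9 -b:v 0 -crf 32 -pass 2 -c:a libopus clip.webm
```

The resulting .webm (VP9 + Opus) plays in every major browser, which is why it clears the Discord/Telegram bar that h265 doesn't.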
For hardware encoding, it will depend on your specific hardware encoder. Even for h264 etc., hardware encoders have always been a step down from software encoding. This is a quick chart comparing Intel's AV1 encoder to other common software and hardware encoders. It's from this video, which iirc was very informative on this topic for Intel GPUs specifically.
I see they put out at least a couple more videos on AV1 in general here and here if you're interested, but I haven't watched them.
Thank you, that was very informative!