Topic: Quality differences in GPU vs CPU encoding?

I'm using InterFrame with FFmpeg (which uses the SVPFlow scripts/dlls), and I'm noticing that changing the GPU flag between 0 and 1 changes my output file (all other settings are identical).

If everything else is exactly the same, what is the quality difference when using GPU versus CPU frame encoding?

Re: Quality differences in GPU vs CPU encoding?

More precise rendering (floating-point vs. integer math), correct rendering at frame edges, rendering in "linear light", and bicubic subsampling.
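To illustrate the "linear light" item: gamma-encoded pixel values are not proportional to physical light, so blending them directly darkens the result. A minimal Python sketch (assuming a simple 2.2 gamma curve rather than the exact sRGB transfer function):

```python
def to_linear(v, gamma=2.2):
    # Convert an 8-bit gamma-encoded value to linear light (0.0..1.0).
    return (v / 255.0) ** gamma

def to_gamma(lin, gamma=2.2):
    # Convert linear light back to an 8-bit gamma-encoded value.
    return round((lin ** (1.0 / gamma)) * 255.0)

# Blending a black (0) and a white (255) pixel, e.g. when interpolating
# between two frames:

# Naive integer average in gamma space -- what simple integer math does.
naive = (0 + 255) // 2        # 127, visibly too dark

# Average in linear light, then re-encode -- the "linear light" approach.
correct = to_gamma((to_linear(0) + to_linear(255)) / 2.0)   # 186
```

The linear-light blend (186) is noticeably brighter than the gamma-space blend (127), which is why the two rendering paths can produce different output even with identical settings.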

Re: Quality differences in GPU vs CPU encoding?

Are those advantages of CPU rendering or advantages of GPU rendering?  (i.e. which has better quality?)

Re: Quality differences in GPU vs CPU encoding?

hmm :roll:
GPU

Re: Quality differences in GPU vs CPU encoding?

Why the roll-eyes? :)

I've read that CPU-based x264 encoding is far superior to hardware-accelerated encoding (though it takes much longer).  I thought SVP might be the same.

Re: Quality differences in GPU vs CPU encoding?

In fact, CPU and GPU rendering were pixel-for-pixel identical in SVP 3.0.x (InterFrame 1).

Now, with all the floating-point math and bicubic sampling, the GPU's result is more accurate and smoother, BUT a little bit blurry.
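The "smooth but slightly blurry" behavior of bicubic sampling can be sketched in Python. This assumes a Catmull-Rom cubic kernel (the thread doesn't state which cubic variant SVP uses); the point is that a cubic filter draws on four neighboring samples, so an edge gets an S-shaped, softer transition than linear interpolation gives, and values can even overshoot the input range (ringing):

```python
def catmull_rom(p0, p1, p2, p3, t):
    # Catmull-Rom cubic interpolation between p1 and p2, 0 <= t <= 1.
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

def bilinear(p1, p2, t):
    # Plain linear interpolation for comparison.
    return p1 + (p2 - p1) * t

edge = [0, 0, 255, 255]                 # a hard black-to-white edge

mid_cubic = catmull_rom(*edge, 0.25)    # ~51.8: softer start of the ramp
mid_linear = bilinear(0, 255, 0.25)     # 63.75: straight-line ramp
flat = catmull_rom(0, 0, 0, 255, 0.5)   # ~-15.9: overshoot next to the edge
```

The cubic result ramps into the edge more gradually than the linear one (which reads as a touch of blur), and in the flat region next to the edge it dips below 0, the ringing that cubic kernels are known for.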

Re: Quality differences in GPU vs CPU encoding?

Thanks for the information!