So, I'm trying to use Pixelate and Gaussian Blur to achieve an "upscaled potato resolution" effect. I want to set a specific number of pixels in the Pixelate VFX to get a specific "resolution", but I can't figure out the relationship between the slider value and the number of pixels the filter produces.
I've done some manual testing here: https://www.desmos.com/calculator/kmpp81fgh1 but as you can see, my findings have been imprecise and inconclusive. So can someone provide an equation for the number of pixels in terms of the slider value and the source's original resolution?
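In case it helps clarify what I'm after, here's my working assumption (just a guess, not confirmed for this Pixelate VFX, and the exact slider-to-block-size mapping is the part I'm missing): the slider sets a block size in source pixels, roughly like this sketch:

```python
# Sketch of my ASSUMED mapping (not confirmed for this plugin):
# slider -> block size in source pixels -> grid of output "pixels".
import math

def effective_resolution(source_w: int, source_h: int, block_size: float) -> tuple[int, int]:
    """Hypothetical: number of blocks across and down for a given block size."""
    blocks_x = math.ceil(source_w / block_size)
    blocks_y = math.ceil(source_h / block_size)
    return blocks_x, blocks_y

def block_size_for_target(source_w: int, target_w: int) -> float:
    """Invert the assumed mapping: what block size gives a target width in blocks?"""
    return source_w / target_w

# Example: 1920x1080 source pixelated down to ~320 blocks wide
print(effective_resolution(1920, 1080, block_size_for_target(1920, 320)))  # -> (320, 180)
```

If the slider isn't a straight block size (e.g. it's normalized 0-100 or scales with resolution), that's exactly the relationship I'd like spelled out.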