Tags: vue.js, html5-canvas, webgl, fabricjs

FabricJS v3.4.0: Filters & maxTextureSize - performance/size limitations


Intro:

I've been experimenting with the fabricJS image filtering features in an attempt to start using them in my web app, but I've run into the following.

It seems fabricJS by default caps the image size for filters (textureSize) at 2048, meaning the largest filterable image is 2048x2048 pixels.

I've attempted to raise the default by calling fabric.isWebGLSupported() and then setting fabric.textureSize = fabric.maxTextureSize, but that still caps it at 4096x4096 pixels, even though the maxTextureSize reported by my device is around 16,000.
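
For reference, here is roughly what that attempt looks like (a sketch only; in the 3.x source the support check is spelled fabric.isWebglSupported, and I re-create the filter backend so it picks up the new textureSize):

    import { fabric } from 'fabric';

    // Sketch of my current attempt (fabric 3.4.0). isWebglSupported() stores the
    // value reported by the WebGL context on fabric.maxTextureSize.
    if (fabric.isWebglSupported && fabric.isWebglSupported(fabric.textureSize)) {
      fabric.textureSize = fabric.maxTextureSize; // my device reports ~16000
      // Re-create the filter backend so the new tile size is actually used...
      fabric.filterBackend = fabric.initFilterBackend();
      // ...yet filtering still appears capped at 4096x4096.
    }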

I realize that devices usually report the full value without accounting for the memory actually available, but 4096 still seems like a hard limitation.

So these are the main issues I need to address before I can start using this feature effectively:

1- Render blocking applyFilters() method:

The current filter application method seems to be render blocking in the browser. Is there a way to call it without blocking rendering, so I can show an indeterminate loading spinner or something similar?

Is it as simple as making the apply-filter method async and calling it from somewhere else in the app? (For context, I'm using Vue with webpack/babel, which polyfills async/await etc.)
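
To make the question concrete, this is the kind of wrapper I had in mind; showSpinner/hideSpinner are placeholders for whatever loading UI I end up with, and the double requestAnimationFrame is just there to let the browser paint once before the blocking call:

    // Sketch: applyFilters() itself is synchronous, so wrapping it in a Promise
    // only helps if the browser gets a chance to paint the spinner first.
    function applyFiltersWithSpinner(img, canvas) {
      showSpinner(); // placeholder
      return new Promise((resolve) => {
        // The first rAF runs just before the next paint, the second one after it.
        requestAnimationFrame(() => requestAnimationFrame(() => {
          img.applyFilters();       // the blocking work happens here
          canvas.requestRenderAll();
          hideSpinner();            // placeholder
          resolve();
        }));
      });
    }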

2- Size limits:

Is there a way to bypass the size limit on images? I'm looking to filter images up to 4800x7200 pixels.

I can think of at least one way to do this, which is to "break up" the image into smaller images, apply the filters, and then stitch it back together (see the sketch below). But I worry it might be a performance hit, as there will be a lot of canvas exports and canvas initializations in this process.
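
Roughly what I mean by breaking the image up, assuming a simple per-pixel filter (grayscale, brightness, ...) where a tile does not need to know about its neighbours:

    // Sketch: filter an oversized image in 2048px tiles with fabric, then stitch
    // the filtered tiles back onto one big 2D canvas. Blur-like filters would
    // need overlap handling at the tile edges, which this ignores.
    function filterInTiles(imgEl, filters, tile = 2048) {
      const out = document.createElement('canvas');
      out.width = imgEl.naturalWidth;
      out.height = imgEl.naturalHeight;
      const ctx = out.getContext('2d');

      for (let y = 0; y < out.height; y += tile) {
        for (let x = 0; x < out.width; x += tile) {
          const w = Math.min(tile, out.width - x);
          const h = Math.min(tile, out.height - y);

          // Copy one tile of the source image into a small canvas.
          const tileCanvas = document.createElement('canvas');
          tileCanvas.width = w;
          tileCanvas.height = h;
          tileCanvas.getContext('2d').drawImage(imgEl, x, y, w, h, 0, 0, w, h);

          // Let fabric filter just this tile.
          const fabricTile = new fabric.Image(tileCanvas);
          fabricTile.filters = filters;
          fabricTile.applyFilters();

          // After applyFilters(), getElement() returns the filtered canvas.
          ctx.drawImage(fabricTile.getElement(), x, y);
        }
      }
      return out;
    }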

I'm surprised fabricJS doesn't do this "chunking" by default, as it's quite a comprehensive library, and they've already gone as far as using WebGL shaders (which are a black box to me) under the hood for filtering performance. Is there a better way to do this?

My other solution would be to send the image to a service (one I hand-roll, or a pre-existing paid one) that applies the filters somewhere in the cloud and returns the result to the user, but that's not a solution I prefer to resort to just yet.

For context, I'm mostly using fabric.Canvas and fabric.StaticCanvas to initialize canvases in my app.

Any insights/help with this would be great.


Solution

  • I wrote the filtering backend for fabricJS together with Mr. Scott Seaward (credits to him too), so I can give you some answers.

    Hard block to 2048

    A lot of MacBooks with only an integrated Intel video card report a max texture size of 4096, but then crash the WebGL instance at anything higher than 2280. This was happening widely in 2017, when the WebGL filtering was written, and a default of 4096 would have left a LOT of notebooks uncovered. Do not forget mobile phones either. You know your userbase, so you can raise the limit to whatever your video card allows and whatever canvas allows in your browser. However big the texture can be, the final image must still be copied into a canvas and displayed (canvas has a different max size depending on browser and device).
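
    As a rough illustration of "whatever your video card allows and whatever canvas allows": probe both limits at startup and take the smaller one. This is only a sketch, not fabric API; the candidate sizes are arbitrary examples and a rejected size is simply skipped.

        // Probe the largest canvas side the browser will actually back with pixels.
        function maxCanvasSide(candidates = [16384, 8192, 4096]) {
          for (const side of candidates) {
            try {
              const probe = document.createElement('canvas');
              probe.width = side;
              probe.height = 1;
              const ctx = probe.getContext('2d');
              ctx.fillRect(side - 1, 0, 1, 1);
              // If the browser silently refused the size, the pixel reads back empty.
              if (ctx.getImageData(side - 1, 0, 1, 1).data[3] !== 0) {
                return side;
              }
            } catch (e) { /* size rejected, try the next candidate */ }
          }
          return 2048;
        }

        const gl = document.createElement('canvas').getContext('webgl');
        const glMax = gl ? gl.getParameter(gl.MAX_TEXTURE_SIZE) : 2048;
        fabric.textureSize = Math.min(glMax, maxCanvasSide());
        fabric.filterBackend = fabric.initFilterBackend();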

    Render blocking applyFilters() method

    WebGL is synchronous, as far as I understand. Creating parallel execution in a thread for filtering operations that are on the order of 20-30 ms (sometimes just a couple of ms in Chrome) seems excessive.

    Also consider that I tried it, but when more than 4 WebGL contexts were open in Firefox some would get dropped. So I settled on one at a time.

    The non-WebGL filtering takes longer of course, and that could probably be done in a separate thread, but fabricJS is a generic library that does vectors, filtering and serialization; it already has a lot on its plate, and filtering performance is not that bad. But I'm open to arguing about it.

    Chunking

    Shutterstock Editor uses fabricJS and is the main reason the WebGL backend was written. The editor also has chunking and can filter bigger images with tiles of 2048 pixels. We did not release that as open source and I do not plan on asking. That kind of tiling limits the kinds of filters you can write, because the code only has knowledge of a limited portion of the image at a time; even just blurring becomes complicated.

    Here is a description of the tiling process; it is written for the casual reader and not only for software engineers, it is just a blog post: https://tech.shutterstock.com/2019/04/30/canvas-webgl-filtering-concepts

    Generic render blocking consideration

    So fabricJS has some pre-written filters implemented with shaders. The timings I note here are from memory and have not been re-verified; a small usage sketch follows the lists below.

    The time spent filtering an image is split across:

    • Uploading the image to the GPU (I do not know how many ms)
    • Compiling the shader (up to 40 ms, it depends)
    • Running the shader (around 2 ms)
    • Downloading the result from the GPU (around 0 ms or 13 ms, depending on which method is used)

    Now the first time you run a filter on a single image:

    • The image gets uploaded
    • Filter compiled
    • Shader Run
    • Result downloaded

    The second time you do this:

    • Shader Run
    • Result downloaded

    When a new filter is added or a filter is changed:

    • New filter compiled
    • One or both shaders run
    • Result downloaded
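
    A minimal usage sketch of that lifecycle, assuming an existing fabric.Image (img) on a fabric.Canvas (canvas) and the built-in Brightness filter:

        // The first run pays upload + compile + run + download; later runs that only
        // change the value of the existing filter pay run + download.
        const brightness = new fabric.Image.filters.Brightness({ brightness: 0 });
        img.filters = [brightness];
        img.applyFilters();              // first run

        function setBrightness(value) {
          brightness.brightness = value; // roughly -1..1
          img.applyFilters();            // later runs
          canvas.requestRenderAll();
        }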

    The most common errors I have noticed in applications built around filtering are:

    • You forget to remove old filters, leaving them active with a value near 0 that produces no visual change but still adds processing time.
    • You connect the filter to a slider change event without throttling, which, depending on the browser/device, can trigger up to 120 filtering operations per second (see the sketch below).
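
    For the second point, a sketch of throttling slider input with requestAnimationFrame, building on the setBrightness example above (the slider id is a placeholder):

        // Collapse a flood of "input" events into at most one filtering pass
        // per animation frame.
        let pendingValue = null;

        document.getElementById('brightness-slider').addEventListener('input', (e) => {
          const value = parseFloat(e.target.value);
          if (pendingValue !== null) {
            pendingValue = value; // a frame is already scheduled, just update the value
            return;
          }
          pendingValue = value;
          requestAnimationFrame(() => {
            setBrightness(pendingValue);
            pendingValue = null;
          });
        });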

    Look at the official simple demo: http://fabricjs.com/image-filters

    Use the sliders to filter, apply even more filters; everything seems pretty smooth to me.