Tags: swift, gpuimage, fragment-shader, imagefilter

Implementing custom filters using GPUImage Swift library


This is probably a dumb question, but I've been stuck on it for a while, so I'm going to ask it anyway.

I'm trying to implement a Hudson/Nashville filter in a pet project. I googled a little, checked out a few open-source projects, and found some Objective-C-based ones (a language I don't understand). They do have the filters implemented using GPUImage2, but I wasn't sure about their approach.

I have the overlay and other images that they used, as well as the GLSL files.

So my question is: how do I go about using these images and shader files to implement a custom filter?

Note: I tried the LookupFilter approach as suggested, but the result wasn't very good. It would be super helpful if you could show me some code. Thanks.

Update:

What I am trying to understand is this: given a custom shader like the one below, how am I supposed to pass the input images for the uniforms inputImageTexture2, inputImageTexture3 & inputImageTexture4? Do I pass them as PictureInputs to a BasicOperation by subclassing it? If so, how? What am I missing? I haven't been able to walk through the code much because of the lack of proper documentation. I have read up on shaders and their different components now, but I'm still not able to figure out how to work with custom filters in GPUImage2. Please help.

precision highp float;

varying highp vec2 textureCoordinate;

uniform sampler2D inputImageTexture;  // source image
uniform sampler2D inputImageTexture2; // blowout
uniform sampler2D inputImageTexture3; // overlay
uniform sampler2D inputImageTexture4; // color map

uniform float strength;

void main()
{
    vec4 originColor = texture2D(inputImageTexture, textureCoordinate);

    vec4 texel = texture2D(inputImageTexture, textureCoordinate);

    // Use the blowout texture together with the source color to look up
    // each channel in the overlay texture.
    vec3 bbTexel = texture2D(inputImageTexture2, textureCoordinate).rgb;

    texel.r = texture2D(inputImageTexture3, vec2(bbTexel.r, texel.r)).r;
    texel.g = texture2D(inputImageTexture3, vec2(bbTexel.g, texel.g)).g;
    texel.b = texture2D(inputImageTexture3, vec2(bbTexel.b, texel.b)).b;

    // Remap each channel through its own row of the color map texture.
    vec4 mapped;
    mapped.r = texture2D(inputImageTexture4, vec2(texel.r, .16666)).r;
    mapped.g = texture2D(inputImageTexture4, vec2(texel.g, .5)).g;
    mapped.b = texture2D(inputImageTexture4, vec2(texel.b, .83333)).b;
    mapped.a = 1.0;

    // Blend between the original and filtered colors by "strength".
    mapped.rgb = mix(originColor.rgb, mapped.rgb, strength);

    gl_FragColor = mapped;
}

Solution

  • The GPUImage convention is that the first input texture to a shader is called inputImageTexture, the second inputImageTexture2, and so on. In the original Objective-C version of GPUImage, you had to manually subclass a filter type that matched the number of input textures in your shader.

    In the Swift GPUImage 2, I've made it so that you just need to use a BasicOperation class or subclass, and it automatically attaches textures to the number of inputs needed for your shader. You do this by initializing a BasicOperation and setting the number of inputs:

    let myOperation = BasicOperation(fragmentShaderFile: myFragmentShader, numberOfInputs: 4)
    

    The above sets numberOfInputs to 4, matching the shader above. By leaving the vertexShaderFile argument as nil (the default), the BasicOperation will pick an appropriate simple vertex shader with four texture inputs.

    All you then need to do is set the inputs to that filter like you would for any other, by adding your new BasicOperation as a target of an image source. The order in which you attach the inputs matters: the first input you attach corresponds to the first texture in your shader, the second to the second, and so on.
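    As a rough sketch, the wiring might look like the following (the image names are placeholders, and driving everything through PictureInput/PictureOutput with synchronous processing is just one possible setup, not something prescribed by the library):

    import GPUImage
    import UIKit

    let source = PictureInput(image: UIImage(named: "photo.jpg")!)
    let blowout = PictureInput(image: UIImage(named: "blowout.png")!)
    let overlay = PictureInput(image: UIImage(named: "overlay.png")!)
    let map = PictureInput(image: UIImage(named: "map.png")!)

    let output = PictureOutput()
    output.imageAvailableCallback = { filteredImage in
        // Use the filtered UIImage here.
    }

    // The attachment order maps to inputImageTexture, inputImageTexture2,
    // inputImageTexture3, and inputImageTexture4, in that order.
    source --> myOperation
    blowout --> myOperation
    overlay --> myOperation
    map --> myOperation
    myOperation --> output

    // Each PictureInput has to process its image so its texture is delivered.
    blowout.processImage(synchronously: true)
    overlay.processImage(synchronously: true)
    map.processImage(synchronously: true)
    source.processImage(synchronously: true)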

    In most cases, BasicOperation is flexible enough as-is, so you won't need to subclass. At most, you might need to provide a custom vertex shader, but that's not needed for the fragment shader code above.
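    For the strength uniform in the shader above, you can supply a value through the operation's uniformSettings rather than subclassing (a minimal sketch, assuming the standard GPUImage 2 ShaderUniformSettings subscript):

    // 0.0 leaves the original image untouched, 1.0 applies the full effect.
    myOperation.uniformSettings["strength"] = 1.0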