I'm converting a Shadertoy to a local Three.js project, and can't get it to render. You can try out the full snippet here.
I think the problem may lie in how I'm converting the iResolution variable. As I understand it, the built-in Shadertoy global iResolution contains the pixel dimensions of the rendering viewport. Here is how iResolution is used in the original Shadertoy:
vec2 uv = fragCoord.xy / iResolution.y;              // aspect-corrected: uv.x runs from 0 to width/height
vec2 ak = abs(fragCoord.xy / iResolution.xy - 0.5);  // per-axis distance from the screen center
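For reference, Shadertoy supplies this implicitly as a vec3 uniform, so a direct port also needs the matching declaration (only the xy components are used above):
uniform vec3 iResolution; // x, y = viewport size in pixels; z = pixel aspect ratio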
In converting this Shadertoy into a local Three.js-based script, I have tried two approaches to converting iResolution:
1) Loading the window dimensions as a Vector2 and sending them into the shader as the uniform vec2 uResolution:
vec2 uv = gl_FragCoord.xy / uResolution.y;
vec2 ak = abs(gl_FragCoord.xy / uResolution.xy - 0.5);
This solution sticks closest to the design of the original Shadertoy, but alas nothing renders.
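In case it's relevant, here is roughly how I'm wiring the uniform into the material (variable names are mine; the shader sources are elided):
var uniforms = {
    uResolution: { value: new THREE.Vector2( window.innerWidth, window.innerHeight ) }
};
var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: vertexShaderSource,
    fragmentShader: fragmentShaderSource
});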
2) The second approach comes from this SO answer and converts the uv coordinates to xy absolute coordinates:
vec2 uvCustom = -1.0 + 2.0 * vUv;
vec2 ak = abs(gl_FragCoord.xy / uvCustom.xy - 0.5);
In this one, I admit I don't fully understand how it works, and my use of uvCustom in the second line may not be correct.
In the end, nothing renders onscreen except a Three.js CameraHelper I'm using; otherwise the screen is black, and the console shows no errors from the JavaScript or WebGL. Thanks for taking a look!
For starters, you don't even need to do this division. If you are using a full-screen quad (PlaneBufferGeometry), you can render it with just the uvs:
vec2 uv = gl_FragCoord.xy / uResolution.xy; // [0,1] across the full-screen quad
vec2 vUv = varyingUV;                       // also [0,1] across the quad
// uv == vUv, near enough (gl_FragCoord samples at pixel centers)
Your vertex shader can look something like this:
varying vec2 varyingUV;

void main(){
    varyingUV = uv;                            // "uv" is a built-in Three.js attribute
    gl_Position = vec4( position.xy, 0., 1. ); // write clip space directly, bypassing the camera matrices
}
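A matching fragment shader can then recover both of the original expressions from the varying alone; the resolution uniform is only needed to restore the aspect ratio (a sketch, with the rest of the effect elided):
varying vec2 varyingUV;
uniform vec2 uResolution;

void main(){
    // fragCoord.xy / iResolution.y is just vUv with x scaled by the aspect ratio
    vec2 uv = varyingUV * vec2( uResolution.x / uResolution.y, 1.0 );
    // fragCoord.xy / iResolution.xy - 0.5 is just vUv - 0.5
    vec2 ak = abs( varyingUV - 0.5 );
    gl_FragColor = vec4( uv, ak.x, 1.0 ); // placeholder output
}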
If you make a new THREE.PlaneGeometry(2,2,1,1), this should render as a full-screen quad.
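On the JavaScript side, the whole setup can be as small as this (a sketch; myShaderMaterial stands in for a ShaderMaterial built from the two shaders above, and since the vertex shader ignores the camera matrices, any camera works):
var quad = new THREE.Mesh( new THREE.PlaneGeometry( 2, 2, 1, 1 ), myShaderMaterial );
scene.add( quad );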