I am trying to find a way to specify antialiasing settings in three.js/WebGL to improve the rendering quality.
The thing is that with the exact same code, if I load a model on a Retina display the antialiasing works quite well (even if I then move the window to my non-Retina external monitor), but everything is pixelated if I first load it on a non-Retina screen.
Here is a screenshot (both in Chrome, both displayed on a Retina display); the left one was loaded on a non-Retina screen, the right one on a Retina screen: https://i.sstatic.net/EJ0mS.png
What I take from this is that three.js somehow uses the pixel density when initializing the antialiasing. Is there any way to tweak this so that I can force it to something better?
Thanks a lot in advance for your help :)
Side note: for the record, the antialiasing also seems to work much better in Firefox. Does anyone know why?
Just in case someone is looking to do the same kind of tweaking I was trying to do, I'll answer my own question.
Based on WaclawJasper's comment, I found some documentation in Three.js related to my issue. From http://threejs.org/docs/#Reference/Renderers/WebGLRenderer:
.setPixelRatio ( value )
Sets the device pixel ratio. This is usually used on HiDPI devices to prevent a blurry output canvas.
I was using renderer.setPixelRatio( window.devicePixelRatio );
in my initialization, which is why the rendering depended on the screen the page was first loaded on.
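For context, the initialization looked roughly like this (a minimal sketch; the surrounding scene and camera setup is assumed):

```js
// Minimal sketch of the original setup; scene/camera creation is assumed.
// antialias: true asks the browser for multisampled rendering.
var renderer = new THREE.WebGLRenderer({ antialias: true });

// window.devicePixelRatio is 2 on a Retina display and 1 on most non-Retina
// screens, so the resolution of the canvas backing buffer ended up depending
// on which screen the page was first loaded on.
renderer.setPixelRatio(window.devicePixelRatio);
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
```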
Regarding the antialiasing, I now artificially raise the pixel density with renderer.setPixelRatio(2)
on non-Retina screens. This results in much more effective antialiasing, at the cost of rendering more pixels; see the sketch below.
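Concretely, the relevant lines now look something like this (using Math.max here is just one way to express "force 2 on non-Retina screens" while keeping the native ratio on HiDPI displays):

```js
// Never let the pixel ratio drop below 2: the canvas is rendered at (at
// least) double resolution and downscaled by the browser, which acts as a
// crude form of supersampling. The cost is rendering up to 4x more pixels.
renderer.setPixelRatio(Math.max(window.devicePixelRatio, 2));
// Re-apply the size so the new pixel ratio takes effect on the buffer.
renderer.setSize(window.innerWidth, window.innerHeight);
```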