I am converting a 512 x 512, 32-bit RGB PNG with encoded height values into a 16-bit grayscale PNG representing height values for a heightmap.
This is the conversion code. The image is converted using image-js.
image = the 32-bit RGB PNG image loaded from an ArrayBuffer
image.getPixelsArray() returns an array of pixels, each with r, g, b channels
async function getHeightArrayStats(image) {
  let decodedHeightArray = [];
  let stats = {};
  stats.minElevation = Number.MAX_VALUE;
  stats.maxElevation = -Infinity; // not Number.MIN_VALUE, which is the smallest positive number
  stats.height = image.height;
  stats.width = image.width;
  let pixelsArray = image.getPixelsArray();
  for (const pixel of pixelsArray) {
    let r = pixel[0];
    let g = pixel[1];
    let b = pixel[2];
    // Conversion function from Mapbox to convert pixel values to height values in meters
    // height = -10000 + ((R * 256 * 256 + G * 256 + B) * 0.1)
    let height = getHeightFromRgb(r, g, b);
    if (height > stats.maxElevation) {
      stats.maxElevation = height;
    }
    if (height < stats.minElevation) {
      stats.minElevation = height;
    }
    decodedHeightArray.push(height);
  }
  console.log(decodedHeightArray);
  return {decodedHeightArray, stats};
}
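For reference, getHeightFromRgb just applies the Mapbox formula from the comment above; a minimal version would be:

function getHeightFromRgb(r, g, b) {
  // Mapbox terrain-RGB decoding: returns height in meters
  return -10000 + ((r * 256 * 256 + g * 256 + b) * 0.1);
}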
I then convert that image to a 16-bit grayscale PNG using the same width and height as the original image.
const { Image } = require('image-js');

async function convertImage(width, height, decodedHeightArray) {
  // kind = color model, bitDepth = bits per channel
  let newImage = new Image(width, height, decodedHeightArray, { kind: 'GREY', bitDepth: 16 });
  return newImage;
}
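One thing worth noting (not part of the original code): the decoded heights are floating-point meters and can be negative, while a GREY image with bitDepth 16 stores integers in the 0–65535 range. A minimal sketch of a linear mapping using the stats from getHeightArrayStats, with the hypothetical helper name heightsToUint16, might look like this:

// Sketch: linearly map heights (meters) into the 16-bit range using the computed stats
function heightsToUint16(decodedHeightArray, stats) {
  const range = (stats.maxElevation - stats.minElevation) || 1;
  return Uint16Array.from(decodedHeightArray, h =>
    Math.round(((h - stats.minElevation) / range) * 65535)
  );
}

The resulting Uint16Array could then be passed to convertImage in place of the raw decodedHeightArray.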
Next I resize the 16-bit 512 x 512 image to 2017 x 2017 using image-js with BICUBIC interpolation:
const resizedImage = greyImage.resize({ // greyImage is the 16-bit image returned by convertImage
  interpolationType: "BICUBIC", // or "BILINEAR" or "NEAREST", in order of decreasing quality
  width: 2017,
  preserveAspectRatio: true // or height: number
});
After the conversion I create a new Landscape with a size of 2017 in Unreal Engine using the newly created heightmap image. Unreal Engine also provides a formula for the terrain Z scale so the terrain will look the way it should.
I calculate the Z-scale in this function
getUnrealZScale(maxElevation) {
  let cm = (maxElevation * 100);    // meters to centimeters
  let zscale = (cm * 0.001953125);  // 0.001953125 = 1/512, per Unreal's heightmap Z-scale formula
  return zscale;
},
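For example, a maxElevation of 1000 m is 100000 cm, and 100000 * 0.001953125 = 195.3125, which is the Z scale entered in Unreal.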
The maxElevation is calculated by the getHeightArrayStats function above using the original 32-bit RGB image.
The problem I am having is that I am getting a stepping result when importing the resized 2017 16bit image into Unreal Engine.
Attached are the input and output images as well as pictures of the landscape in Unreal using these images.
32-bit 512 x 512 RGB height-encoded terrain PNG from Mapbox
Here is the converted 16-bit image, downsized from 512 x 512 to 505 x 505 per the Unreal Landscape size specs. Zoom in to see the black-and-white height values. I also added a blur filter, img.blurFilter() from image-js, to soften the edges.
Here is the image of the Unreal Landscape after importing the above image, with no Z-scale applied since the image was only slightly downsized; I think this is the correct way to do it. This looks decent and is about what I expected. Next is the upscale, where my problem is.
16 bit heightmap scaled to 2017x2017
https://i.sstatic.net/UuAq3.jpg
Result in Unreal with no Z-scale applied; you can see the stepping.
Do I need to increase the pixels per inch? I thought that was what the interpolation was doing. Can I just calculate it manually when I process my decodedHeightArray? If so how?
Also a big thanks to @traktor for help with the conversion code.
Increasing the size from 512 x 512 to 2017 x 2017 results in almost the same pixel value being expanded across roughly 4 pixels, hence the stepping.
By doing the 32-bit to 16-bit conversion first you are losing the precision needed for the expansion.
Expand the image to the final size while it is still 32 bpp.
Blur the expanded image to provide the gradient.
Downsample the resulting image to 16 bits. See the sketch below for this order of operations.
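A rough image-js sketch of that order, reusing the names and functions from the question (rgbImage, getHeightArrayStats, and convertImage are assumed to be the ones defined above; run inside an async function):

const bigRgb = rgbImage.resize({
  interpolationType: "BICUBIC",
  width: 2017,
  preserveAspectRatio: true // square input, so the result is 2017 x 2017
});
const smoothed = bigRgb.blurFilter(); // blur while still at full color depth
const { decodedHeightArray, stats } = await getHeightArrayStats(smoothed);
const heightmap16 = await convertImage(smoothed.width, smoothed.height, decodedHeightArray);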