An array I pass by ref from C# to a C++ library function comes back with a length of 1 instead of its actual length when run on Android. The code works fine when built for Windows, but not for Android.
FYI, this is a Unity project and I'm using OpenCV.
I have the following function in the library.
extern "C" void ApplyCannyEffect(Color32 **rawImage, int width, int height)
{
Mat image(height, width, CV_8UC4, *rawImage);
flip(image, image, -1);
Mat edges;
Canny(image, edges, 50, 200);
dilate(edges, edges, (5, 5));
cvtColor(edges, edges, COLOR_GRAY2RGBA);
normalize(edges, edges, 0, 1, NORM_MINMAX);
multiply(image, edges, image);
flip(image, image, 0);
}
Color32 is defined as
struct Color32
{
    uchar red;
    uchar green;
    uchar blue;
    uchar alpha;
};
In my C# code I declare this function as follows:
[DllImport("test")]
internal static extern void ApplyCannyEffect(ref Color32[] rawImage, int width, int height);
In C# I get the image data from the texture. Unity's GetPixels32 function returns an array of structs for each pixel starting with the pixel in the bottom left (Unity's convention, not OpenCV's). The structs are four bytes, one each for red, green, blue, and alpha values of each pixel. I then check the length of the array, and it is the expected size (width in pixels * height in pixels). I call the C++ function. I then check the length of the array and it is 1. Needless to say, I am not working with an image that is 1 pixel in size.
Color32[] rawImage = texture.GetPixels32();
Debug.Log(rawImage.Length.ToString()); //output is as expected on both Windows and Android
ApplyCannyEffect(ref rawImage, texture.width, texture.height);
Debug.Log(rawImage.Length.ToString()); //output is same as above on Windows but "1" on Android
It works as expected on Windows but not on Android, even though the relevant code is identical between my Windows build and my Android build.
What causes this?
I am not too familiar with C++, but I did not expect the C++ side to need the Color32 parameter passed as a pointer to a pointer (I got this example code from a tutorial). Is this because I am passing an array of structs? By using two levels of indirection on an array of structs, does it end up as a pointer to the first uchar of the first struct element of the array? Is that why the array becomes length 1 after the function call?
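To make my question concrete, here is how I picture the two declarations lining up with the native side (my guess, not verified):

// What I have now: my understanding is that 'ref Color32[]' marshals a
// pointer to the array reference itself, so the native side sees
// Color32**, and the marshaller rebuilds the managed array on return.
[DllImport("test")]
internal static extern void ApplyCannyEffect(ref Color32[] rawImage, int width, int height);

// What I suspect it would need to be for a flat pixel buffer: the array
// itself marshals as a pointer to its first element, so the native side
// would take Color32* instead (again, guessing -- this is what I am asking).
[DllImport("test")]
internal static extern void ApplyCannyEffect(Color32[] rawImage, int width, int height);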
To test this, I tried converting the array of Color32 structs in C# to a byte array four times as long (the r, g, b, and a data sequentially filling the byte array):
private byte[] Color32toByteArray(Color32[] rawImage)
{
    int outputIndex = 0;
    byte[] output = new byte[rawImage.Length * 4];
    for (int c = 0; c < rawImage.Length; c++)
    {
        output[outputIndex] = rawImage[c].r;
        outputIndex++;
        output[outputIndex] = rawImage[c].g;
        outputIndex++;
        output[outputIndex] = rawImage[c].b;
        outputIndex++;
        output[outputIndex] = rawImage[c].a;
        outputIndex++;
    }
    Debug.Log("rawImage byte array length " + output.Length.ToString());
    return output;
}
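For reference, the BytetoColor32Array helper used further below is just the inverse repacking; a minimal sketch of it:

private Color32[] BytetoColor32Array(byte[] rawByteImage, int width, int height)
{
    // Rebuild one Color32 struct from every four sequential bytes.
    Color32[] output = new Color32[width * height];
    for (int c = 0; c < output.Length; c++)
    {
        output[c] = new Color32(
            rawByteImage[c * 4],      // r
            rawByteImage[c * 4 + 1],  // g
            rawByteImage[c * 4 + 2],  // b
            rawByteImage[c * 4 + 3]); // a
    }
    return output;
}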
I then passed the output to my Windows C++ DLL's function, which I modified to:
extern "C" __declspec(dllexport) void ApplyCannyEffect_Byte(unsigned char* rawImage, int width, int height)
{
using namespace cv;
// create an opencv object sharing the same data space
Mat image(height, width, CV_8UC4, rawImage);
// start with flip (in both directions) if your image looks inverted
flip(image, image, -1);
Mat edges;
Canny(image, edges, 50, 200);
dilate(edges, edges, (5, 5));
cvtColor(edges, edges, COLOR_GRAY2RGBA);
normalize(edges, edges, 0, 1, NORM_MINMAX);
multiply(image, edges, image);
// flip again (just vertically) to get the right orientation
flip(image, image, 0);
}
I called it with the following code:
Color32[] rawImage = texture.GetPixels32();
int length = rawImage.Length;
byte[] rawByteImage = Color32toByteArray(rawImage);
ApplyCannyEffect_Byte(ref rawByteImage, texture.width, texture.height);
rawImage = BytetoColor32Array(rawByteImage, texture.width, texture.height);
texture.SetPixels32(rawImage);
texture.Apply();
Debug.Log((rawByteImage.Length / 4).ToString() + " " + rawImage.Length.ToString());
But this also crashes the program (on Windows) at the ApplyCannyEffect_Byte call. Commenting that line out works as expected, and the output for the array lengths is correct.
Why does the first set of code work on Windows, but not Android?
This may be a packing issue. Consider using Unity's Color32 struct, which is perfectly aligned for use in native code.
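For illustration, a C# struct that lines up byte-for-byte with the native struct above would look like the sketch below (a hypothetical Color32Interop, shown only to make the layout explicit; in practice just use UnityEngine.Color32, which already has this layout):

using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential, Pack = 1)]
public struct Color32Interop
{
    public byte r; // matches uchar red on the native side
    public byte g;
    public byte b;
    public byte a;
}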
Also, you can't pass a managed array as ref (with ref, internal info such as the array length may be placed before the actual data and then get overwritten by the DLL code). For this call you should use:
Color32[] rawImage = texture.GetPixels32();
var ptr = Marshal.UnsafeAddrOfPinnedArrayElement(rawImage, 0);
ApplyCannyEffect_Byte(ptr, texture.width, texture.height);
texture.SetPixels32(rawImage);
texture.Apply();
...
// Modified function import:
[DllImport("test")]
internal static extern void ApplyCannyEffect_Byte(IntPtr rawImage, int width, int height);
NOTE: there is no byte-array conversion here, because it isn't actually needed; it hurts performance a lot and creates useless GC allocations. The native function can write directly to the Color32 array.
NOTE2: For a fully GC-free implementation, consider using the NativeArray returned by the Texture2D.GetPixelData() method; you can get an IntPtr from the NativeArray and dispose of it after SetPixels32. Alternatively, preallocate a Color32[] array and pin it with GCHandle (not a really convenient solution).
No need to change DLL function signature.
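For example, a minimal sketch of the GCHandle variant from NOTE2, reusing the ApplyCannyEffect_Byte import above (GCHandle lives in System.Runtime.InteropServices):

Color32[] rawImage = texture.GetPixels32();
// Pin the array so the GC cannot move it while native code writes into it.
GCHandle handle = GCHandle.Alloc(rawImage, GCHandleType.Pinned);
try
{
    ApplyCannyEffect_Byte(handle.AddrOfPinnedObject(), texture.width, texture.height);
}
finally
{
    handle.Free(); // always unpin, even if the native call throws
}
texture.SetPixels32(rawImage);
texture.Apply();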