I'm currently trying to implement a simple GUI in GUIslice for a TFT using the ILI9341 display driver and the XPT2046 touch controller, and I have come across a problem.
In my module, for whatever reason, the Y coordinate of a touch is flipped with respect to the display coordinate. That is, a touch lands on a given button only if I touch at the X coordinate the button occupies, but at a Y position as far from the top of the screen as the button is from the bottom.
There seems to be nothing in the regular config that lets me set a variable or option to tell GUIslice to flip the touch Y coordinate.
I have however found a couple of macros in GUIslice.h which seem relevant:
/// Additional definitions for Touch Handling
/// These macros define the transforms used in remapping the touchscreen
/// inputs on the basis of the GUI nRotation setting.
#define TOUCH_ROTATION_DATA 0x6350
#define TOUCH_ROTATION_SWAPXY(rotation) ((( TOUCH_ROTATION_DATA >> ((rotation&0x03)*4) ) >> 2 ) & 0x01 )
#define TOUCH_ROTATION_FLIPX(rotation) ((( TOUCH_ROTATION_DATA >> ((rotation&0x03)*4) ) >> 1 ) & 0x01 )
#define TOUCH_ROTATION_FLIPY(rotation) ((( TOUCH_ROTATION_DATA >> ((rotation&0x03)*4) ) >> 0 ) & 0x01 )
However, I have not been able to figure out how they work. Am I even supposed to use them, and if so, how? If not, is there another way to solve this problem?
GUIslice has two places where rotations are set:
#define GSLC_ROTATE 3
for instance, and
gslc_GuiRotate(&m_gui, 3);
which calls the driver's rotation function and is usually generated into the InitGUI...() function by the builder. The preprocessor definition is evaluated in the rotation routine of the chosen driver, which also uses the calibration settings and the TOUCH_ROTATION_... macros to set up the correct relationship between the display and the touch screen:
bool gslc_DrvRotate(gslc_tsGui* pGui, uint8_t nRotation)
{
  bool bChange = true;
  bool bSupportRotation = true;

  // Determine if the new orientation has swapped axes
  // versus the native orientation (0)
  bool bSwap = false;
  if ((nRotation == 1) || (nRotation == 3)) {
    bSwap = true;
  }

  [...]

  // Now update the touch remapping
  #if !defined(DRV_TOUCH_NONE)
    // Correct touch mapping according to current rotation mode
    pGui->nSwapXY = TOUCH_ROTATION_SWAPXY(pGui->nRotation);
    pGui->nFlipX  = TOUCH_ROTATION_FLIPX(pGui->nRotation);
    pGui->nFlipY  = TOUCH_ROTATION_FLIPY(pGui->nRotation);
  #endif // !DRV_TOUCH_NONE

  [...]
}
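To make the macros less opaque: TOUCH_ROTATION_DATA packs one nibble per rotation value, and the three macros simply extract single bits from that nibble. A small stand-alone program (not part of GUIslice) shows what they evaluate to for each rotation:

#include <stdio.h>
#include <stdint.h>

// Copied from GUIslice.h: one nibble per rotation, bits = [SWAPXY, FLIPX, FLIPY]
#define TOUCH_ROTATION_DATA 0x6350
#define TOUCH_ROTATION_SWAPXY(rotation) ((( TOUCH_ROTATION_DATA >> ((rotation&0x03)*4) ) >> 2 ) & 0x01 )
#define TOUCH_ROTATION_FLIPX(rotation)  ((( TOUCH_ROTATION_DATA >> ((rotation&0x03)*4) ) >> 1 ) & 0x01 )
#define TOUCH_ROTATION_FLIPY(rotation)  ((( TOUCH_ROTATION_DATA >> ((rotation&0x03)*4) ) >> 0 ) & 0x01 )

int main(void)
{
  // Prints: rotation 0 -> swap=0 flipX=0 flipY=0
  //         rotation 1 -> swap=1 flipX=0 flipY=1
  //         rotation 2 -> swap=0 flipX=1 flipY=1
  //         rotation 3 -> swap=1 flipX=1 flipY=0
  for (uint8_t nRot = 0; nRot < 4; nRot++) {
    printf("rotation %u -> swap=%u flipX=%u flipY=%u\n",
           nRot,
           (unsigned)TOUCH_ROTATION_SWAPXY(nRot),
           (unsigned)TOUCH_ROTATION_FLIPX(nRot),
           (unsigned)TOUCH_ROTATION_FLIPY(nRot));
  }
  return 0;
}

So for rotation 3, for example, the touch axes are swapped and X is flipped, but Y is not.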
One might say that this mechanism sets up the initial or base rotation.
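Conceptually, once nSwapXY, nFlipX and nFlipY are set, every raw touch point is remapped before it is matched against GUI elements. The following is only a simplified sketch of that idea (not GUIslice's actual touch code, and it glosses over details such as which dimension applies after a swap), but it shows why a wrong FLIPY bit produces exactly the mirrored-Y behaviour described in the question:

#include <stdint.h>
#include <stdio.h>

// Simplified illustration only: remap a touch point already scaled to display
// coordinates using the swap/flip flags. Not GUIslice's actual touch code.
static void RemapTouch(uint16_t nDispW, uint16_t nDispH,
                       uint8_t nSwapXY, uint8_t nFlipX, uint8_t nFlipY,
                       uint16_t* pnX, uint16_t* pnY)
{
  uint16_t nX = *pnX;
  uint16_t nY = *pnY;
  if (nSwapXY) {                           // axes swapped in this orientation
    uint16_t nTmp = nX; nX = nY; nY = nTmp;
  }
  if (nFlipX) { nX = (nDispW - 1) - nX; }  // mirror horizontally
  if (nFlipY) { nY = (nDispH - 1) - nY; }  // mirror vertically
  *pnX = nX;
  *pnY = nY;
}

int main(void)
{
  // A touch near the top of a 240x320 portrait screen...
  uint16_t nX = 100, nY = 20;
  // ...with FLIPY wrongly set ends up near the bottom instead (Y = 299).
  RemapTouch(240, 320, 0, 0, 1, &nX, &nY);
  printf("X=%u Y=%u\n", (unsigned)nX, (unsigned)nY);
  return 0;
}

If the flip bit disagrees with how the panel is actually wired, Y ends up measured from the wrong edge while X stays correct.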
Setting the preprocessor directive alone, however, does NOT actually cause the screen to be rotated.
This is where the second mechanism comes into play. It actually rotates the output by calling the driver's rotation routine outlined above, but unless the preprocessor directive is set correctly, the groundwork that aligns and arranges everything correctly never happens:
bool gslc_GuiRotate(gslc_tsGui* pGui, uint8_t nRotation)
{
  // Simple wrapper for driver-specific rotation
  bool bOk = gslc_DrvRotate(pGui, nRotation);

  // Invalidate the new screen dimensions
  gslc_InvalidateRgnScreen(pGui);

  return bOk;
}
In other words, without the call, the preprocessor directive would set up the correct mapping but never actually rotates anything; and without the directive, the call rotates the output but cannot get the touch mapping right.
Because of this interdependence, both mechanisms must be in place and in sync for the rotation to happen correctly and for touch events to align with the displayed GUI elements.
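Concretely, one way to keep the two in sync (the header name and the m_gui handle follow the usual Builder output and may differ in your project) is to have the generated call reference the same define:

// In the GUIslice config header:
#define GSLC_ROTATE 3                  // base rotation evaluated by the driver

// In the InitGUI...()/setup code:
gslc_GuiRotate(&m_gui, GSLC_ROTATE);   // rotate the output with the same value,
                                       // so the touch swap/flip flags match it

Since the driver's rotation routine also uses the calibration settings, if the Y axis is still mirrored once both values agree, the touch controller's calibration values are the next thing to check.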