The figure below shows an orange area where a baby's palm will be placed. All I need to know is how to detect whether the baby's palm is the right or the left one.
Simply put: this cannot be done (accurately).
The screen of an iOS device can detect touch events and the coordinates at which those touch events took place. However, the device has no way of knowing what was placed on the screen. For all the device cares, it could be a face planted on the screen just as easily as a hand.
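As a quick illustration (a minimal sketch using a hypothetical UIView subclass), the only information each touch gives you is its location as a CGPoint:

import UIKit

class PalmView: UIView {
    // The system only reports where each touch landed;
    // nothing identifies what kind of object produced it.
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let point = touch.location(in: self)
            print("Touch at x: \(point.x), y: \(point.y)")
        }
    }
}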
A possible workaround could be to ask the user to place five fingers on the screen and then, based on the coordinates of the touch events, conclude whether it is a left or a right hand. Pseudo-code example:
if (*the leftmost coordinate is lower down on the screen than the rest*) {
    // It is probably a right hand, since the thumb is
    // (on most humans at least) lower down than the rest of the fingers
}
The same approach could be applied to detect the left hand. However, this is only a rough way to tell which hand is which. To do this more accurately, I would suggest letting the user take a photo of his/her hand and analyzing the image instead.
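If you still want to try the touch-based heuristic, here is a rough Swift sketch (the function and type names are just illustrative; you would need isMultipleTouchEnabled = true on the view and would pass in the locations of all five touches, e.g. from event?.allTouches):

import UIKit

enum Hand { case left, right, unknown }

// Rough heuristic: with five fingertips down, the thumb is usually the
// leftmost or rightmost touch and sits lower on the screen (larger y in
// UIKit coordinates) than the remaining four fingers.
func guessHand(from points: [CGPoint]) -> Hand {
    guard points.count == 5 else { return .unknown }
    let byX = points.sorted { $0.x < $1.x }
    let leftmost = byX.first!, rightmost = byX.last!
    if leftmost.y > byX.dropFirst().map({ $0.y }).max()! { return .right }
    if rightmost.y > byX.dropLast().map({ $0.y }).max()! { return .left }
    return .unknown
}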
Hope it helps!