I have a rather creative requirement - I am not sure whether it is feasible - but it would certainly spice up my app if it is.
Premise: On Android phones, if the screen is covered by a hand (not touching, just close to the screen), or if the phone is held to the ear during a call, the screen turns off. So there must be some technology that can recognize that my hand is near the screen.
Problem: I have an image in my app. If the user points at the image without touching the screen - just as an extension of the premise - I need to be able to detect that the user is pointing at the image and change it. Is this possible?
UPDATE: An example use:
Say I want to build a fun app where touching an image leads somewhere else. For example, I have two doors: one to a car and one to a lion. Just as the user is about to touch door 1 - before actually touching it - the door should show a message asking "Are you sure?", and then actually touching it takes you to the other place. It's a rudimentary example, but I hope you get the point.
The feature you are talking about is the proximity sensor. See Sensor and SensorEvent.values for Sensor.TYPE_PROXIMITY.
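For reference, here is a minimal sketch of reading the proximity sensor from an Activity (the class name and the image-swap comment are illustrative). Note that many devices report only two states through `SensorEvent.values[0]`: near (0) and far (the sensor's maximum range).

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class ProximityActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor proximity;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        // May be null if the device has no proximity sensor.
        proximity = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
    }

    @Override
    protected void onResume() {
        super.onResume();
        if (proximity != null) {
            sensorManager.registerListener(this, proximity,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Always unregister to avoid draining the battery.
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // values[0] is the distance in cm; many devices only ever
        // report 0 (near) or the maximum range (far).
        float distance = event.values[0];
        boolean near = distance < proximity.getMaximumRange();
        // React to "near" here, e.g. swap the image.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

This only tells you that *something* is close to the sensor (which usually sits near the earpiece), which leads to the limitation below.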
You could get the distance of the hand from the screen, but you won't know where the hand is in the XY coordinate system. So you won't be able to figure out whether the user is pointing at the "car door" or the "lion door".