I have a list in a file, "file.txt", of GPS coordinates in the format "latitude, longitude". I will try to explain the example or code I would like in Python, the language I am trying to learn.
GPS = current position + RADIUS / MARGIN = 0.9 (900 meters)
The current GPS position would be read from the serial port "/dev/ttyS0", using a GPS module connected to a Raspberry Pi 3 (Raspbian).
I need to know whether my current position is within the RADIUS / MARGIN of 900 meters of any of the coordinates in "file.txt", i.e. get TRUE or FALSE.
file.txt
-34.61517, -58.38124
-34.61517, -58.38124
-34.61527, -58.38123
-34.61586, -58.38121
-34.61647, -58.38118
-34.61762, -58.38113
-34.61851, -58.38109
-34.61871, -58.38109
-34.61902, -58.38108
-34.61927, -58.38108
-34.61953, -58.38108
-34.61975, -58.38106
-34.61979, -58.38112
-34.6198, -58.38113
-34.61981, -58.38115
-34.61983, -58.38116
-34.61986, -58.38117
-34.61993, -58.38118
-34.62011, -58.38119
-34.62037, -58.38121
-34.62059, -58.38122
-34.62075, -58.38122
-34.6209, -58.38122
-34.62143, -58.38117
-34.62157, -58.38115
-34.62168, -58.38115
-34.6218, -58.38114
-34.62191, -58.38115
-34.62199, -58.38116
-34.62206, -58.38119
-34.62218, -58.38123
-34.62227, -58.38128
-34.62234, -58.38134
-34.62241, -58.3814
-34.62249, -58.38149
-34.62254, -58.38156
-34.62261, -58.38168
-34.62266, -58.38179
-34.62273, -58.38194
-34.62276, -58.38201
-34.62283, -58.38238
-34.62282, -58.38261
-34.62281, -58.38291
-34.62281, -58.38309
-34.62281, -58.38313
-34.62281, -58.3836
-34.62281, -58.38388
-34.62282, -58.38434
-34.62282, -58.38442
-34.62283, -58.3845
-34.62283, -58.38463
-34.62285, -58.38499
-34.62287, -58.3853
-34.6229, -58.38581
-34.62291, -58.38589
-34.62292, -58.38597
-34.62297, -58.38653
-34.623, -58.3868
-34.62303, -58.3871
-34.623, -58.38713
-34.62299, -58.38714
-34.62298, -58.38715
-34.62298, -58.38716
-34.62297, -58.38717
-34.62297, -58.38728
-34.62297, -58.38735
-34.62298, -58.38755
-34.62299, -58.3877
-34.62305, -58.38829
-34.62308, -58.38848
-34.6231, -58.38865
-34.62311, -58.38874
-34.62316, -58.3892
-34.62318, -58.38933
Is this possible in Python? Thanks in advance (:
I'm not sure this is exactly what you wanted to know. This solution addresses the problem "Given a point and my current position, is my distance from that point less than a specific value?".
If that's the case and the distances are small enough (less than 1 km), you can use the Pythagorean theorem:
distance = c*6371*pi/180*sqrt((currentPosition.lat - targetLat)**2 +
                              (currentPosition.long - targetLong)**2)
where c is a coefficient you have to tune for your zone (in Italy it's about 0.8, for example; just divide the real distance, which you can obtain with Google Maps, by the result you get with c set to 1), 6371 is the Earth's radius in kilometers, and pi/180 converts degrees to radians. Then you can just compare the distance with the maximum distance you want:
distance < maxDistance
In this case, maxDistance is 0.9.
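As a minimal sketch in Python (the function names and the default c = 0.9 are my own illustrative choices; you would calibrate c for your zone as described above):

```python
from math import pi, sqrt

def flat_distance_km(lat1, lon1, lat2, lon2, c=0.9):
    """Flat-earth (Pythagorean) approximation of the distance in km.

    c is the locally tuned correction coefficient described above;
    0.9 here is only a placeholder. 6371 is the Earth's radius in km
    and pi/180 converts degrees to radians.
    """
    return c * 6371 * pi / 180 * sqrt((lat1 - lat2) ** 2 + (lon1 - lon2) ** 2)

def within_margin(lat1, lon1, lat2, lon2, max_distance_km=0.9, c=0.9):
    """True if the two points are closer than max_distance_km."""
    return flat_distance_km(lat1, lon1, lat2, lon2, c) < max_distance_km
```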
Notice that this formula is an approximation but, given the short distances you're dealing with, it can be accurate enough. You should use spherical trigonometry for larger distances (for example, the flat approximation makes no sense if you want to compare two points on different continents). In that case, this is the formula you should use - the great-circle formula:
distance = 6371*acos(sin(lat1)*sin(lat2) + cos(lat1)*cos(lat2)*cos(long1-long2))
where (lat1, long1) and (lat2, long2) are the spherical coordinates, in radians, of the two points you are measuring. Then compare the distance with maxDistance like in the previous expression, and you're done.
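In Python this can be sketched as follows (the function name is my own; math.radians handles the degrees-to-radians conversion, and the clamp guards acos against floating-point rounding):

```python
from math import radians, sin, cos, acos

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    # Spherical law of cosines; clamp keeps the argument in acos's domain.
    arg = sin(lat1) * sin(lat2) + cos(lat1) * cos(lat2) * cos(lon1 - lon2)
    return 6371 * acos(max(-1.0, min(1.0, arg)))
```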
If you want to solve the problem for a set of points in a txt file, just read those values and iterate over them with a for loop.
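Putting it together, a sketch of reading file.txt and checking the current position against every listed point (the names load_points and any_point_within are my own, and the distance uses the great-circle formula; your current position would come from the GPS module instead of being hard-coded):

```python
from math import radians, sin, cos, acos

def load_points(path):
    """Parse 'latitude, longitude' lines from the file into float pairs."""
    points = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue  # skip blank lines
            lat_s, lon_s = line.split(",")
            points.append((float(lat_s), float(lon_s)))
    return points

def any_point_within(cur_lat, cur_lon, points, max_distance_km=0.9):
    """True if any listed point is within max_distance_km (great-circle)."""
    for lat, lon in points:
        arg = (sin(radians(cur_lat)) * sin(radians(lat))
               + cos(radians(cur_lat)) * cos(radians(lat))
               * cos(radians(cur_lon - lon)))
        if 6371 * acos(max(-1.0, min(1.0, arg))) < max_distance_km:
            return True
    return False
```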
See https://en.wikipedia.org/wiki/Great-circle_distance for further details.