I am plotting points using ogr in Python 3, and it works fine if I enter the coordinates directly as numbers, but the following code fails when I try to use coordinates read from a CSV file.
Working code:
from osgeo import ogr
point = ogr.Geometry(ogr.wkbPoint)
point.AddPoint(40.729047, -73.957472)
print(point.ExportToWkt())
Code that breaks:
from osgeo import ogr
import pandas as pd
datatest = pd.read_csv('data.csv')
p1_lat, p1_lon = datatest['POINT1_LAT'], datatest['POINT1_LON']
point = ogr.Geometry(ogr.wkbPoint)
point.AddPoint(p1_lat, p1_lon)
print(point.ExportToWkt())
That second example fails with the error TypeError: in method 'Geometry_AddPoint', argument 2 of type 'double'
What am I doing wrong, and how can I fix it so the code can read coordinates from my CSV file? Thanks.
EDIT: as requested, if I print the two variables I get:
0 40.729047
Name: POINT1_LAT, dtype: float64
0 -73.957472
Name: POINT1_LON, dtype: float64
EDIT2: This is Python 3
You are passing pandas Series objects to AddPoint instead of floats. Selecting a column like datatest['POINT1_LAT'] returns a whole Series, while AddPoint expects plain double values.
import pandas as pd
pts = pd.DataFrame({'p': [3.4532], 'q': [5.674]})
print(pts['p'])            # a Series, not a number
print(pts['q'])
print(float(pts['p'][0]))  # the first value, extracted as a float
print(float(pts['q'][0]))
0 3.4532
Name: p, dtype: float64
0 5.674
Name: q, dtype: float64
3.4532
5.674
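One caveat: pts['p'][0] looks the value up by index label, not by position, so it fails if the DataFrame's index doesn't contain the label 0 (for example, after filtering rows). If that can happen with your CSV, .iloc[0] selects by position instead; a minimal sketch with a hypothetical non-default index:

```python
import pandas as pd

# Index label is 7, so pts['p'][0] would raise a KeyError here.
pts = pd.DataFrame({'p': [3.4532], 'q': [5.674]}, index=[7])

lat = float(pts['p'].iloc[0])  # positional: first row, regardless of index labels
lon = float(pts['q'].iloc[0])
print(lat, lon)
```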
This should work:
from osgeo import ogr
import pandas as pd

datatest = pd.read_csv('data.csv')
p1_lat = float(datatest['POINT1_LAT'][0])  # first row's value as a float
p1_lon = float(datatest['POINT1_LON'][0])
point = ogr.Geometry(ogr.wkbPoint)
point.AddPoint(p1_lat, p1_lon)
print(point.ExportToWkt())