I have a problem where I need to interpolate a 3D function using e.g. SciPy, and then save the output of this interpolation for future use. That is, I don't want to re-run the interpolation procedure every time, as generating the 3D function to be interpolated is computationally demanding (it comes from the Biot-Savart law, so it involves many numerical integrations).
However, I'm having trouble understanding whether this is possible and, if so, how to implement it. From what I've seen in other posts it should be possible, but the solutions don't seem to work for me.
I have written the following test code, but I receive the error below when testing it:
TypeError: 'numpy.ndarray' object is not callable
The error occurs on the line starting zeroVal in the function loadInterpolation(). I was hoping that allow_pickle=True would solve this, based on what I read previously.
import numpy as np
import scipy
from scipy.interpolate import RegularGridInterpolator

def f(x, y, z):
    field = -x**2 - y**2 + z**2
    return field

def performSaveInterpolation():
    print(scipy.__version__)
    print('Performing Interpolation...')
    x = np.linspace(-1, 1, 100)
    y = np.linspace(-1, 1, 100)
    z = np.linspace(-1, 1, 100)
    xg, yg, zg = np.meshgrid(x, y, z, indexing='ij', sparse=True)
    data = f(xg, yg, zg)
    my_interpolating_function = RegularGridInterpolator((x, y, z), data)
    zeroVal = my_interpolating_function([0, 0, 0])
    oneVal = my_interpolating_function([1, 1, 1])
    print('Interpolated function @ (0,0,0): ' + str(zeroVal))
    print('Interpolated function @ (1,1,1): ' + str(oneVal))
    np.save('interpolation.npy', my_interpolating_function, allow_pickle=True)
    return 0

def loadInterpolation():
    print('Loading Interpolation...')
    interpolationFunc = np.load('interpolation.npy', allow_pickle=True)
    zeroVal = interpolationFunc([0, 0, 0])  # TypeError raised here
    oneVal = interpolationFunc([1, 1, 1])
    print('Interpolated function @ (0,0,0): ' + str(zeroVal))
    print('Interpolated function @ (1,1,1): ' + str(oneVal))
    return 0

performSaveInterpolation()
loadInterpolation()
Why not use pickle directly? Related question: How-can-i-use-pickle-to-save-a-dict.
Pickle is able to serialize almost any type of Python object, not just numpy arrays (which is all numpy's save function is designed for: it cannot save object types other than np.ndarray, and an interpolation function is not a numpy.ndarray, hence the error; np.load hands you back an ndarray, not a callable).
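As a concrete sketch applied to the question's setup, the interpolator object itself can be pickled and called after reloading. The 5-point grid and file name here are illustrative choices so the example runs quickly, not the question's 100-point grid:

```python
import pickle
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Small illustrative grid; the question uses np.linspace(-1, 1, 100)
x = y = z = np.linspace(-1, 1, 5)
xg, yg, zg = np.meshgrid(x, y, z, indexing='ij', sparse=True)
data = -xg**2 - yg**2 + zg**2

interp = RegularGridInterpolator((x, y, z), data)

# Serialize the interpolator object itself, not a numpy array
with open('interpolation.pck', 'wb') as fh:
    pickle.dump(interp, fh)

# Later (e.g. in another script): load it and call it directly
with open('interpolation.pck', 'rb') as fh:
    loaded = pickle.load(fh)

print(loaded([0, 0, 0]))  # same value as before pickling
print(loaded([1, 1, 1]))
```

Because pickle restores the full object, the loaded result is callable exactly like the original interpolator, which is what np.save/np.load could not give you here.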
Use it like this:
import pickle

with open('mypicklefile.pck', 'wb') as file_handle:
    pickle.dump(my_saved_object, file_handle)
then:
with open('mypicklefile.pck', 'rb') as file_handle:
    my_loaded_object = pickle.load(file_handle)
If you ever need to serialize multiple objects at once (for example, your function plus zeroVal and oneVal, to avoid re-evaluating them from the function every time), you could either:
Serialize multiple times into the same file:
with open('mypicklefile.pck', 'wb') as file_handle:
    pickle.dump(my_saved_object, file_handle)
    pickle.dump(my_saved_object2, file_handle)
    pickle.dump(my_saved_object3, file_handle)
And then deserialize the same number of times (order matters! You have to remember which object is which):
with open('mypicklefile.pck', 'rb') as file_handle:
    my_loaded_object = pickle.load(file_handle)
    my_loaded_object2 = pickle.load(file_handle)
    my_loaded_object3 = pickle.load(file_handle)
Or (the better solution, in my opinion) use a dictionary, so each object carries a tiny bit of metadata, namely a name (its key):
mydata = {"tiny_description": my_saved_object, "something_more": my_saved_object2}

with open('mypicklefile.pck', 'wb') as file_handle:
    pickle.dump(mydata, file_handle)

with open('mypicklefile.pck', 'rb') as file_handle:
    temp_dict = pickle.load(file_handle)

tiny_description = temp_dict["tiny_description"]
something_more = temp_dict["something_more"]
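Applied to the question, the dictionary approach might bundle the interpolator together with the precomputed zeroVal and oneVal so nothing needs to be re-evaluated after loading. The key names, file name, and the small 5-point grid below are illustrative assumptions:

```python
import pickle
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Illustrative small grid; the question uses a 100^3 grid
x = y = z = np.linspace(-1, 1, 5)
xg, yg, zg = np.meshgrid(x, y, z, indexing='ij', sparse=True)
interp = RegularGridInterpolator((x, y, z), -xg**2 - yg**2 + zg**2)

# Bundle the interpolator with precomputed sample values
bundle = {
    "interpolator": interp,
    "zeroVal": interp([0, 0, 0]),
    "oneVal": interp([1, 1, 1]),
}

with open('bundle.pck', 'wb') as fh:
    pickle.dump(bundle, fh)

with open('bundle.pck', 'rb') as fh:
    loaded = pickle.load(fh)

# Precomputed values come back without re-evaluating the interpolator,
# and the interpolator itself is still callable
print(loaded["zeroVal"], loaded["oneVal"])
print(loaded["interpolator"]([0.5, 0.5, 0.5]))
```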