I am trying to fit a (double) Gaussian function to a two-dimensional data set with scipy.optimize's curve_fit. Here is the code:
import numpy as np
from scipy.optimize import curve_fit

df = np.loadtxt(fname=r"...mydat.dat")
X = np.linspace(0, 448, 449)
Y = np.linspace(0, 448, 449)
xdata = np.asarray(np.meshgrid(X, Y)).reshape(2, 201601)  # reformatting data for curve_fit
Z = df.reshape(201601)  # flatten the 449x449 measurements to match xdata

def Gauss(x, A, B, C, D):
    z = A * np.exp(((x[0] - 224)**2 + (x[1] - 224)**2) / (2 * B)) \
        + C * np.exp(((x[0] - 224)**2 + (x[1] - 224)**2) / (2 * D))
    return z

parameters, covariance = curve_fit(Gauss, xdata, Z)
This leads to: OverflowError: (34, 'Result too large')
The mean of the data should be at the coordinates (224, 224). The obvious problem is that a number like e**(224**2) has to be stored at some point, which cannot be done with double-precision floats. I have tried the decimalfp package, but that did not resolve it; I expect the numbers are simply too large. For now I have worked around the problem by switching to a Cauchy distribution. Still, this problem could in principle be solved easily by reducing the value of the coordinates, so it's still bothering me. Any advice?
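For what it's worth, here is a minimal sketch of the coordinate-reduction idea I mean, using synthetic data in place of mydat.dat (the single-Gaussian shape, the amplitude A=5, the width B=400, and the noise level are all made up for illustration). The grid is centred so the peak sits at (0, 0), and the exponent carries a minus sign so that e**(...) never exceeds 1:

```python
import numpy as np
from scipy.optimize import curve_fit

# Centred coordinates: peak at (0, 0) instead of (224, 224),
# so the argument of the exponential stays small.
X = np.linspace(-224, 224, 449)
Y = np.linspace(-224, 224, 449)
xdata = np.asarray(np.meshgrid(X, Y)).reshape(2, -1)

def gauss(x, A, B):
    # Single Gaussian; the minus sign keeps the exponential bounded by 1.
    return A * np.exp(-(x[0]**2 + x[1]**2) / (2 * B))

# Synthetic "measurement": a Gaussian with A=5, B=400 plus mild noise
# (hypothetical values, standing in for the real data file).
rng = np.random.default_rng(0)
Z = gauss(xdata, 5.0, 400.0) + 0.01 * rng.standard_normal(xdata.shape[1])

parameters, covariance = curve_fit(gauss, xdata, Z, p0=[1.0, 100.0])
```

With the shifted grid the fit runs without overflow and recovers A and B; the same shift should apply to the double-Gaussian case by subtracting 224 once from the coordinate arrays instead of inside the model.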