Strain interpolation overflow issue
When applied to large strain input data (typically sampled around 4 Hz over T = 2 yr, i.e. the DDPC Mojito requirements), building the interpolants for h+ and hx at ReadStrain initialization (in the set_strain method) leads to a memory overflow.
This has been traced to the use of scipy's InterpolatedUnivariateSpline, which is now considered legacy and is likely not optimal for large input arrays. SciPy now advises switching to make_interp_spline, which implements a comparable algorithm.
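A minimal sketch of the proposed change is given below. This is not the actual lisagwresponse code: the spline order k=5 and the example time grid are assumptions for illustration only.

```python
# Sketch of replacing the legacy FITPACK-based spline with make_interp_spline.
import numpy as np
from scipy.interpolate import InterpolatedUnivariateSpline, make_interp_spline

t = np.arange(0, 100.0, 0.25)           # hypothetical 4 Hz time grid
hplus = np.sin(2 * np.pi * 1e-3 * t)    # hypothetical h+ strain samples

# Legacy approach (memory-hungry for very long time series)
interp_old = InterpolatedUnivariateSpline(t, hplus, k=5)

# Recommended replacement: interpolating B-spline of the same order
interp_new = make_interp_spline(t, hplus, k=5)

t_fine = np.arange(1.0, 99.0, 0.1)
h_old = interp_old(t_fine)
h_new = interp_new(t_fine)
```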
Comparisons of the two algorithms, using sinusoidal inputs both far from and close to the Nyquist frequency (@hinchaus) as well as an impulse function (@wkastaun), have demonstrated that the interpolation outputs are equivalent.
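A quick consistency check along these lines could look as follows. This is not the benchmark script used in the comparisons above; the sampling rate, tone frequency, and spline order are illustrative assumptions.

```python
# Hypothetical check: compare both interpolants on a sinusoid approaching Nyquist.
import numpy as np
from scipy.interpolate import InterpolatedUnivariateSpline, make_interp_spline

fs = 4.0                                 # assumed sampling rate [Hz]
t = np.arange(0, 1000.0, 1.0 / fs)
f_signal = 0.4 * fs / 2                  # tone at 40% of the Nyquist frequency
h = np.sin(2 * np.pi * f_signal * t)

t_eval = np.arange(1.0, 999.0, 0.01)
h_old = InterpolatedUnivariateSpline(t, h, k=5)(t_eval)
h_new = make_interp_spline(t, h, k=5)(t_eval)

# Expected to be at numerical-noise level if the two interpolants are equivalent
print(np.max(np.abs(h_old - h_new)))
```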
This issue was identified and addressed on the ESA GitLab (https://gitlab.esa.int/lisa-sgs/sim/lisasim-bench/-/issues/2) in the context of DDPC simulations. We replicate it here because it requires a fix in lisagwresponse.