I need to plot the average of several trials of my sampled data y. The difficult part is that the corresponding sample points x do not coincide between trials. However, x can only range from 0 to 1.
The following minimal reproducible example
import matplotlib.pyplot as plt

# Two trials sampled at different, non-coinciding x positions
x1 = [0.1, 0.2, 0.4, 0.6, 0.75, 0.9]
x2 = [0, 0.14, 0.53, 0.6, 0.81, 0.9, 0.98]
y1 = [1, 4, 6, 5, 5, 3]
y2 = [3, 4, 7, 9, 7, 4, 2]

plt.plot(x1, y1, marker='o')
plt.plot(x2, y2, marker='x')
plt.grid()
plt.xlabel('x')
plt.ylabel('y')
plt.show()
produces this plot.
What is the nicest and most Pythonic way to do that?
Is there a better solution than linearly interpolating the data to create a common x scale?
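For concreteness, the linear-interpolation approach I have in mind would look roughly like this, using np.interp to resample both trials onto a shared grid (the 100-point grid is an arbitrary choice):

import numpy as np
import matplotlib.pyplot as plt

x1 = [0.1, 0.2, 0.4, 0.6, 0.75, 0.9]
x2 = [0, 0.14, 0.53, 0.6, 0.81, 0.9, 0.98]
y1 = [1, 4, 6, 5, 5, 3]
y2 = [3, 4, 7, 9, 7, 4, 2]

# Common grid over the shared domain [0, 1]
x_common = np.linspace(0, 1, 100)

# Resample each trial onto the common grid; note np.interp clamps to the
# endpoint values outside each trial's sampled range (e.g. below x1[0] = 0.1)
y1_i = np.interp(x_common, x1, y1)
y2_i = np.interp(x_common, x2, y2)

y_mean = np.mean([y1_i, y2_i], axis=0)

plt.plot(x_common, y_mean, label='average')
plt.legend()
plt.show()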