I'm using pandas to plot some data, and I have the following code to plot multiple pandas DataFrames on a common x-axis:
import matplotlib.pyplot as plt

# place where my data comes from
list_of_dfs = [instance.parent_perspective.associated[expression]
               for expression in instance.parent_perspective.associated]

figure, axis = plt.subplots(len(list_of_dfs), 1, sharex=True)

for dataframe in list_of_dfs:
    print(dataframe)

for i, dataframe in enumerate(list_of_dfs):
    ax = dataframe.plot(ax=axis[i])
The dataframes are indexed by pandas Timedelta objects and come at different sampling rates, so the spacing between consecutive Timedelta values differs between some of the dataframes.
The code runs when I order the dataframes by sample rate from lower to higher, but when they are ordered from higher to lower sample rate, I get the following error:
File "C:Users[me]..[my_script].py", line 110, in [my_script]
ax = dataframe.plot(ax=axis[i])
File "C:Users[me]AppDataRoamingPythonPython310site-packagespandasplotting_core.py", line 1030, in __call__
return plot_backend.plot(data, kind=kind, **kwargs)
File "C:Users[me]AppDataRoamingPythonPython310site-packagespandasplotting_matplotlib__init__.py", line 71, in plot
plot_obj.generate()
File "C:Users[me]AppDataRoamingPythonPython310site-packagespandasplotting_matplotlibcore.py", line 501, in generate
self._make_plot(fig)
File "C:Users[me]AppDataRoamingPythonPython310site-packagespandasplotting_matplotlibcore.py", line 1544, in _make_plot
newlines = plotf(
File "C:Users[me]AppDataRoamingPythonPython310site-packagespandasplotting_matplotlibcore.py", line 1589, in _ts_plot
freq, data = maybe_resample(data, ax, kwds)
File "C:Users[me]AppDataRoamingPythonPython310site-packagespandasplotting_matplotlibtimeseries.py", line 84, in maybe_resample
series.index = series.index.asfreq( # type: ignore[attr-defined]
AttributeError: 'TimedeltaIndex' object has no attribute 'asfreq'. Did you mean: 'freq'?
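For reference, here is a minimal sketch of how I think the situation can be reproduced without my data source. The frame names, sizes and values below are made up, and I'm assuming only the index spacing matters; I haven't confirmed that this synthetic version fails in exactly the same way as my real data:

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical stand-ins for my real dataframes: same kind of TimedeltaIndex,
# but with different sampling rates (10 ms steps vs. 0.1 ms steps).
slow = pd.DataFrame(
    {"value": np.zeros(1000)},
    index=pd.timedelta_range(start="0s", periods=1000, freq="10ms"),
)
fast = pd.DataFrame(
    {"value": np.random.rand(100_000)},
    index=pd.timedelta_range(start="0s", periods=100_000, freq="100us"),
)

# High sample rate first -- the ordering that crashes for me.
list_of_dfs = [fast, slow]

figure, axis = plt.subplots(len(list_of_dfs), 1, sharex=True)
for i, dataframe in enumerate(list_of_dfs):
    dataframe.plot(ax=axis[i])
plt.show()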
I'm assuming it has something to do with the way pandas' matplotlib plotting backend tries to reconcile the shared x-axis when the series have different sample rates.
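One workaround I'm considering (I'm not sure it's the right approach) is to bypass pandas' time-series x-axis machinery entirely and plot through matplotlib, using the index converted to seconds, so the asfreq call never happens:

# Sketch of a possible workaround: plot via matplotlib directly, with
# TimedeltaIndex.total_seconds() as the x values, so pandas' frequency /
# resampling logic is never invoked.
figure, axis = plt.subplots(len(list_of_dfs), 1, sharex=True)
for i, dataframe in enumerate(list_of_dfs):
    axis[i].plot(dataframe.index.total_seconds(), dataframe["value"])
    axis[i].set_ylabel("value")
axis[-1].set_xlabel("time [s]")
plt.show()

I haven't verified yet whether this behaves the same as dataframe.plot with sharex=True, so I'd still like to understand the error itself.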
I'm also sharing the prints of my dataframes so you can verify that I'm not handing over bad data.
From lower to higher sample rate:
value
0 days 00:00:00 0.0
0 days 00:00:00.010000 0.0
0 days 00:00:00.020000 0.0
0 days 00:00:00.030000 0.0
0 days 00:00:00.040000 0.0
… …
0 days 00:10:24.950000 0.0
0 days 00:10:24.960000 0.0
0 days 00:10:24.970000 0.0
0 days 00:10:24.980000 0.0
0 days 00:10:24.990000 0.0

[62500 rows x 1 columns]
value
0 days 00:00:00 0.0
0 days 00:00:00.010000 0.0
0 days 00:00:00.020000 0.0
0 days 00:00:00.030000 0.0
0 days 00:00:00.040000 0.0
… …
0 days 00:10:24.950000 0.1
0 days 00:10:24.960000 0.1
0 days 00:10:24.970000 0.1
0 days 00:10:24.980000 0.1
0 days 00:10:24.990000 0.1

[62500 rows x 1 columns]
value
-1 days +23:59:59.999100 0.031557
-1 days +23:59:59.999200 0.032379
-1 days +23:59:59.999300 0.031419
-1 days +23:59:59.999400 0.032287
-1 days +23:59:59.999500 0.031857
… …
0 days 00:10:24.998600 6.252964
0 days 00:10:24.998700 6.251458
0 days 00:10:24.998800 6.251105
0 days 00:10:24.998900 6.248893
0 days 00:10:24.999000 6.248886

[6250000 rows x 1 columns]
And finally from higher to lower sample rate; this is the ordering that crashes (same data, just with the high-sample-rate dataframe placed first):
value
-1 days +23:59:59.999100 0.031557
-1 days +23:59:59.999200 0.032379
-1 days +23:59:59.999300 0.031419
-1 days +23:59:59.999400 0.032287
-1 days +23:59:59.999500 0.031857
… …
0 days 00:10:24.998600 6.252964
0 days 00:10:24.998700 6.251458
0 days 00:10:24.998800 6.251105
0 days 00:10:24.998900 6.248893
0 days 00:10:24.999000 6.248886

[6250000 rows x 1 columns]
value
0 days 00:00:00 0.0
0 days 00:00:00.010000 0.0
0 days 00:00:00.020000 0.0
0 days 00:00:00.030000 0.0
0 days 00:00:00.040000 0.0
… …
0 days 00:10:24.950000 0.1
0 days 00:10:24.960000 0.1
0 days 00:10:24.970000 0.1
0 days 00:10:24.980000 0.1
0 days 00:10:24.990000 0.1

[62500 rows x 1 columns]
value
0 days 00:00:00 0.0
0 days 00:00:00.010000 0.0
0 days 00:00:00.020000 0.0
0 days 00:00:00.030000 0.0
0 days 00:00:00.040000 0.0
… …
0 days 00:10:24.950000 0.0
0 days 00:10:24.960000 0.0
0 days 00:10:24.970000 0.0
0 days 00:10:24.980000 0.0
0 days 00:10:24.990000 0.0

[62500 rows x 1 columns]