I am working with a dataset whose timestamps are stored in a CSV file in local time. Let me first generate such timestamps synthetically:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from datetime import datetime, timedelta
########################
# Part 1: Generating data
########################
# Step 1: Generate UTC data range
tArr = pd.date_range(
    start="2022-01-01 00:00:00",
    end="2023-01-01 00:00:00",
    periods=365*24*4,
    tz="UTC",
)
# Step 2: Convert it to local (Europe/London) time
tArrLocal = tArr.tz_convert('Europe/London')
# Step 3: Convert it to text
tArrLocalStr = tArrLocal.strftime('%Y-%m-%d %H:%M:%S')
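In my actual workflow these strings are written to and read back from a CSV file. A minimal sketch of that round trip (the file name is just a placeholder; the rest of this post keeps using tArrLocalStr directly):
# Hypothetical round trip through a CSV file; "timestamps.csv" is a placeholder name
pd.Series(tArrLocalStr, name="timestamp").to_csv("timestamps.csv", index=False)
tArrLocalStrFromFile = pd.read_csv("timestamps.csv")["timestamp"]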
########################
# Part 2: Reading generated data
########################
# Step 4: Attempt to read text data back to datetime
tArrLocalRead = pd.to_datetime(tArrLocalStr)
print(tArrLocalRead)
DatetimeIndex(['2022-01-01 00:00:00', '2022-01-01 00:15:00',
'2022-01-01 00:30:00', '2022-01-01 00:45:00',
'2022-01-01 01:00:00', '2022-01-01 01:15:00',
'2022-01-01 01:30:00', '2022-01-01 01:45:00',
'2022-01-01 02:00:00', '2022-01-01 02:15:00',
...
'2022-12-31 21:44:59', '2022-12-31 21:59:59',
'2022-12-31 22:14:59', '2022-12-31 22:29:59',
'2022-12-31 22:44:59', '2022-12-31 22:59:59',
'2022-12-31 23:14:59', '2022-12-31 23:29:59',
'2022-12-31 23:44:59', '2023-01-01 00:00:00'],
dtype='datetime64[ns]', length=35040, freq=None)
We generated a year's worth of timestamps sampled roughly every 15 minutes, converted them to an array of strings, and then read them back into a DatetimeIndex. Let's have a look at the timedeltas between consecutive samples:
# Convert a DatetimeIndex to integer epoch seconds
to_timestamp = lambda dt: dt.values.astype('int64') // 10**9
plt.figure()
plt.plot(np.diff(to_timestamp(tArrLocalRead)), linestyle='--', label='Read')
plt.plot(np.diff(to_timestamp(tArrLocal)), label='True')
plt.legend()
plt.show()
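Instead of eyeballing the plot, the jumps can also be located numerically; a quick check (the 900 s nominal step and the half-hour threshold are ad-hoc choices for this example):
# Find samples whose spacing deviates from the nominal 15 min step by more than
# half an hour; these are the two DST transitions.
diffs = np.diff(to_timestamp(tArrLocalRead))
jump_idx = np.where(np.abs(diffs - 900) > 1800)[0]
print(tArrLocalRead[jump_idx + 1], diffs[jump_idx])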
So, clearly, the timestamps we have read back contain two 1-hour jumps due to DST. It would make sense to make the index timezone-aware, so that the jumps disappear. I have tried the function designed for this:
tArrLocalInfer = tArrLocalRead.tz_localize('Europe/Zurich')
and got the error NonExistentTimeError: 2022-03-27 02:03:29
I have also tried using the ambiguous inference flag:
tArrLocalInfer = tArrLocalRead.tz_localize('Europe/Zurich', ambiguous='infer')
and got the error AmbiguousTimeError: 2022-10-30 02:12:24
How do I read such data from a file and recover the correct time-zone information? It is guaranteed that the times in the file are sorted in causal order (a newer time always comes after an older one), so, in my understanding, this should make the problem unambiguous.
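To illustrate what that ordering guarantee looks like in the data: the underlying instants are sorted, and the naive read-back is monotonic except for a single backwards step at the fall-back transition, which is why I would expect the repeated hour to be recoverable:
# The instants are ordered, but the naive wall-clock representation has exactly one
# backwards step (at the fall-back transition).
print(tArrLocalRead.is_monotonic_increasing)               # False
print((np.diff(to_timestamp(tArrLocalRead)) < 0).sum())    # 1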
There are several related questions online, but I have not seen a satisfactory solution so far. The question "tz_localize AmbiguousTimeError: Cannot infer dst time with non DST dates" suggests manually specifying which points are to be treated as DST via a boolean array (sketched below), but that seems to defeat the point of having automated DST inference. It appears that a related problem is still an open issue in pandas.
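For reference, here is roughly what that manual workaround would look like for this data, using Europe/London (the zone the synthetic data was generated in); the hard-coded transition boundary and the order-based flag flip are exactly the bookkeeping I was hoping pandas could do for me:
# Sketch of the manual workaround: build the `ambiguous` boolean array by hand.
naive = tArrLocalRead
sec = to_timestamp(naive)
# 2022 Europe/London DST period in wall-clock terms, hard-coded:
is_dst = (naive >= '2022-03-27 02:00:00') & (naive < '2022-10-30 02:00:00')
# The file is sorted, so the second pass through the repeated 01:00-02:00 hour
# starts at the single point where the wall clock jumps backwards.
back = np.where(np.diff(sec) < 0)[0]
if len(back) == 1:
    is_dst[back[0] + 1:] = False
tArrManual = naive.tz_localize('Europe/London', ambiguous=is_dst)
Maintaining this kind of lookup for every file, zone, and year is exactly what I would like to avoid.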