Hey, I'm writing a paper on using hybrid deep learning models for Bitcoin closing price prediction.
from sklearn.preprocessing import MinMaxScaler

# chronological 80/20 train/test split
train_size = int(len(features) * 0.8)
train_data = features[:train_size]
test_data = features[train_size:]

# scale to [0, 1] using statistics from the training set only
scaler = MinMaxScaler(feature_range=(0, 1))
scaled_train_data = scaler.fit_transform(train_data)
scaled_test_data = scaler.transform(test_data)  # transform only, no refit, to avoid leakage
I've seen two other questions about this, but to no avail. Of course I also tried ChatGPT and other papers, and no one really addresses it. The answers to those two questions pointed to dataset size and normalization problems, but I have 15k rows of data, and the code above shows how I normalized.
The prediction setup uses the previous 20 hours to predict the next 3 hours. What I don't understand is that every prediction I make, even on new slices of data taken from the original dataset, follows the same pattern: an L shape where the prediction moves up a bit and then drops sharply.
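For reference, the window construction looks roughly like this (simplified sketch; make_windows and the constant names are just illustrative, and I'm assuming column 0 is the close price):

import numpy as np

LOOKBACK = 20   # hours of history fed to the model
HORIZON = 3     # hours predicted ahead

def make_windows(data, target_col=0, lookback=LOOKBACK, horizon=HORIZON):
    X, y = [], []
    for i in range(lookback, len(data) - horizon + 1):
        X.append(data[i - lookback:i])              # past 20 hours, all features
        y.append(data[i:i + horizon, target_col])   # next 3 hours of the close column
    return np.array(X), np.array(y)

X_train, y_train = make_windows(scaled_train_data)
X_test, y_test = make_windows(scaled_test_data)

So X_train ends up with shape (samples, 20, n_features) and y_train with shape (samples, 3).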
Why does this pattern keep repeating? My model consists of one LSTM layer, one GRU layer, and a Dense output layer, trained for 200 epochs. I tried making the model more robust by adding another LSTM and GRU layer, and even dropout.
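The architecture is roughly the following (a simplified sketch; the 64-unit layer sizes and the batch size are placeholders, not my exact settings):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, GRU, Dense

model = Sequential([
    LSTM(64, return_sequences=True,
         input_shape=(X_train.shape[1], X_train.shape[2])),  # 20 timesteps x n_features
    GRU(64),
    Dense(y_train.shape[1])   # 3 outputs, one per future hour
])
model.compile(optimizer='adam', loss='mse')
model.fit(X_train, y_train, epochs=200, batch_size=32, validation_split=0.1)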
Any help or tips would be appreciated.