Background- Lube oil filters are widely used in manufacturing industries. These filters are changed once they are choked/clogged. The degree of clogging is determined by the pressure difference (DP) across the filter, i.e. before vs. after it. When DP reaches 1 the filter has to be replaced; the DP across a new filter is about 0.1.
These filters do not have the same life every time we replace them; when there are more impurities in the line oil, they can get clogged at an earlier stage. Correctly determining the rate of change of filter DP, i.e. the rate at which filter DP is increasing, is crucial for continuous operation.
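For clarity, by rate of change I mean the day-over-day increase in filter DP. A minimal sketch of that calculation (assuming pandas and one DP reading per day, as in my data):

```python
import pandas as pd

# One DP reading per day; diff() gives the daily increase in DP,
# e.g. 0.3 - 0.2 = 0.1 on the second day.
dp = pd.Series([0.2, 0.3, 0.35, 0.5],
               index=pd.date_range("2024-01-01", periods=4, freq="D"),
               name="Filter_DP")
print(dp.diff())
```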
Problem Statement- We want to build a predictive model that learns the ideal rate of change from historical data and then correctly predicts the rate of change during operation.
Use case- If the rate of change of filter DP is higher than the historical rate of change, corrective action can be taken to identify the cause of the increased rate of clogging and rectify the issue.
Data- The data we get is a time series of filter DP values.
Sample data looks like the following:
| Timestamp  | Filter_DP |
| ---------- | --------- |
| 1-1-2024   | 0.2 |
| 2-1-2024   | 0.3 |
| 3-1-2024   | 0.35 |
| 4-1-2024   | 0.5 |
| 5-1-2024   | 0.6 |
| 6-1-2024   | 0.7 |
| 7-1-2024   | 0.8 |
| 8-1-2024   | 0.9 |
| 9-1-2024   | 1.0 (filter change) |
| 10-1-2024  | 0.2 |
| 11-1-2024  | 0.25 |
| 12-1-2024  | 0.4 |
| ...        | ... |
Can someone please give a few ideas on how to develop such a model?
If you need more data or more context on the question, please let me know.
I have tried time series forecasting with Holt's method, but the challenge is to identify the point at which the filter is changed, as this does not happen at regular intervals.
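One direction I am considering is to detect the change points from the DP signal itself: a filter replacement shows up as a large negative drop in DP. Below is a rough sketch (assuming pandas/numpy; the -0.5 drop threshold and the per-cycle linear fit are just assumptions for illustration) that splits the series into filter cycles and computes one clogging rate per cycle:

```python
import numpy as np
import pandas as pd

# DP readings spanning two filter cycles (values from the sample table above).
df = pd.DataFrame(
    {"Filter_DP": [0.2, 0.3, 0.35, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 0.2, 0.25, 0.4]},
    index=pd.date_range("2024-01-01", periods=12, freq="D"),
)

def split_into_cycles(df, drop_threshold=-0.5):
    """Start a new filter cycle wherever Filter_DP drops sharply
    (i.e. the filter was replaced). The threshold is an assumption."""
    df = df.copy()
    df["cycle"] = (df["Filter_DP"].diff() < drop_threshold).cumsum()
    return df

def cycle_rates(df):
    """Fit a straight line to each cycle and return its slope (DP increase per day)."""
    slopes = {}
    for cycle_id, grp in df.groupby("cycle"):
        if len(grp) < 2:                      # need at least two points for a slope
            continue
        days = (grp.index - grp.index[0]).days.astype(float)
        slopes[cycle_id] = np.polyfit(days, grp["Filter_DP"].values, 1)[0]
    return pd.Series(slopes, name="DP_per_day")

rates = cycle_rates(split_into_cycles(df))
print(rates)   # one clogging rate per filter cycle, to compare against history
```

Whether a simple drop threshold is robust enough, and whether a linear fit is the right model for the within-cycle trend, is exactly the kind of feedback I am looking for.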