I'm working on a project that processes a user request and generates a report in CSV format. The data is read from a PostgreSQL 15.2 database. I'm using Spring Batch, with a JdbcCursorItemReader<Map<String, Object>> that retrieves data from a Postgres function:
@Bean
@StepScope
public JdbcCursorItemReader<Map<String, Object>> rptReader(
        @Value("#{jobParameters['functionName']}") String functionName,
        @Value("#{jobParameters['costCenterId']}") Long costCenterId,
        @Value("#{jobParameters['startDate']}") LocalDate startDate,
        @Value("#{jobParameters['endDate']}") LocalDate endDate) {
    log.info("RUNNING {}", functionName);
    // try-with-resources so the connection used for this check is returned to the pool
    try (Connection conn = dataSource.getConnection()) {
        log.info("AUTOCOMMIT VALUE {}", conn.getAutoCommit());
    } catch (SQLException e) {
        log.error("Could not read autocommit setting", e);
    }
    return new JdbcCursorItemReaderBuilder<Map<String, Object>>()
            .dataSource(dataSource)
            .name("reader")
            .preparedStatementSetter(new ArgumentPreparedStatementSetter(
                    new Object[] { costCenterId.intValue(), startDate, endDate }))
            .sql(buildFunction(functionName))
            .fetchSize(100)
            .rowMapper(new ColumnMapRowMapper())
            .build();
}
private String buildFunction(String functionName) {
    var sql = new StringBuilder();
    sql.append("SELECT * FROM ");
    sql.append(functionName);
    sql.append("(?, ?, ?)");
    return sql.toString();
}
I'm getting java.lang.OutOfMemoryError: Java heap space, and it seems all 1.2 million rows are loaded into memory at once. It looks as if fetchSize has no effect when the SQL calls a Postgres function. However, when I extracted the underlying query from the function and ran it directly, it worked without any issue.
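For what it's worth, the PostgreSQL JDBC driver only honors fetchSize (i.e., uses a server-side cursor and streams rows in batches) when the connection has autocommit disabled; with autocommit on, the driver fetches the entire result set into the client regardless of fetchSize, which would match both the OOM and the AUTOCOMMIT VALUE you're logging. The driver also requires a forward-only result set and a single SQL statement, which JdbcCursorItemReader already uses as far as I know. If you're on HikariCP (Spring Boot's default pool), autocommit can be turned off at the pool level — a sketch, assuming the standard Spring Boot property name:

```properties
# Disable autocommit so pgjdbc can use a cursor and honor fetchSize
spring.datasource.hikari.auto-commit=false
```

Note this changes autocommit for every connection in that pool, so if other parts of the application rely on autocommit you may want a separate DataSource for the batch job.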
Has anyone experienced this scenario? If you have, could you share your thoughts?
I'm not an expert on Postgres functions, but I'm trying to change the function to use a cursor.
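In case the refcursor route helps, the usual pattern is a function that OPENs a cursor and returns it, which the caller then FETCHes from inside a transaction. A rough sketch, with hypothetical function, table, and column names:

```sql
-- Hypothetical function returning a cursor over the report rows
CREATE OR REPLACE FUNCTION rpt_cost_center_cur(p_cost_center int, p_start date, p_end date)
RETURNS refcursor
LANGUAGE plpgsql AS $$
DECLARE
    c refcursor := 'rpt_cur';
BEGIN
    OPEN c FOR
        SELECT *
        FROM report_rows
        WHERE cost_center_id = p_cost_center
          AND report_date BETWEEN p_start AND p_end;
    RETURN c;
END;
$$;

-- Caller side (must run inside a transaction):
-- BEGIN;
-- SELECT rpt_cost_center_cur(42, '2024-01-01', '2024-12-31');
-- FETCH 100 FROM rpt_cur;
-- COMMIT;
```

That said, if the root cause is autocommit being enabled, the existing SETOF-style function plus fetchSize may stream fine once autocommit is off, and the refcursor rewrite may not be necessary.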
The Spring version I'm using is 3.3.2.