I’m using the pure JavaScript version of AG Grid.
My code for cell styling looks something like this:
cellStyle: params => {
    return getCellColor(params); // my custom func to determine cell color based on value
}
Here is a simplified version of getCellColor(); the actual function is more complex and colors cells in a gradient based on their distance from the column’s min, max, and average or median.
function getCellColor(params) {
    // determines these values across all cells in the column
    let { min, max, median, average } = calculateMinMaxMedianAverage(params.api, params.colDef.field);
    if (params.value > 0) {
        return { "color": "rgba(255,255,255,1)" };
    } else {
        return { "color": "rgba(0,0,0,1)" };
    }
}
Now, as I mentioned, my real function color-codes based on distance from the min, max, and average or median. This means a single extreme outlier can drastically change the color coding for the entire column.
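To make the outlier problem concrete, here is a minimal sketch of the gradient idea (the lerpColor helper and the color stops are just illustrative, not my real code):

// Hypothetical helper: linearly interpolate between two RGB triples.
function lerpColor(from, to, t) {
    const mix = (a, b) => Math.round(a + (b - a) * t);
    return "rgba(" + mix(from[0], to[0]) + "," + mix(from[1], to[1]) + "," + mix(from[2], to[2]) + ",1)";
}

function getCellColor(params) {
    const { min, max } = calculateMinMaxMedianAverage(params.api, params.colDef.field);
    // Normalize the value into [0, 1]. One huge outlier max inflates the
    // denominator, pushing every other cell's t toward 0 and washing out
    // the gradient for the rest of the column.
    const t = (params.value - min) / ((max - min) || 1);
    return { "backgroundColor": lerpColor([255, 255, 255], [0, 128, 0], t) };
}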
My calculateMinMaxMedianAverage() function ultimately loops through each node via params.api.forEachNode. each node seems to have a node.displayed value of true/false which as far as i can tell represents which cells are expected to be visible in the grid (regardless of if it’s actually rendered or not due to scrolling virtualization). So it seemed like if excluded nodes with node.displayed == false from the calculateMinMaxMedianAverage() calculations that it should work. However, this didn’t seem to be the case as my color coding results seemed the same either way.
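Here is roughly what that function looks like (simplified; assume the field values are plain numbers and the column is never empty):

function calculateMinMaxMedianAverage(api, field) {
    const values = [];
    api.forEachNode(node => {
        // Skip rows the grid reports as not displayed; this is the check
        // that doesn't seem to change my results.
        if (!node.displayed) return;
        const value = node.data ? node.data[field] : undefined;
        if (typeof value === "number") values.push(value);
    });
    values.sort((a, b) => a - b);
    const min = values[0];
    const max = values[values.length - 1];
    const mid = Math.floor(values.length / 2);
    const median = values.length % 2 ? values[mid] : (values[mid - 1] + values[mid]) / 2;
    const average = values.reduce((sum, v) => sum + v, 0) / values.length;
    return { min, max, median, average };
}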
Is there something I’m missing?