The following paper evaluates the impact of different y-axis ranges on bias and on perceived effect sizes. The authors test three versions of the y-axis:
The full condition showed the full range from 0 to 100 on a hypothetical memory test.
The minimal condition showed the smallest range necessary to see the data.
The standardized condition was centered on the group mean and extended by one to two SDs in either direction.
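For concreteness, here is a rough sketch of how those three ranges could be computed in R for a hypothetical vector of memory-test scores; the scores and the 1.5 SD multiplier are my own illustrative choices, not values from the paper:

# Hypothetical scores on a 0-100 memory test (illustrative only)
scores <- c(62, 55, 71, 48, 66, 59, 74, 52)

# Full condition: the entire possible range of the scale
full_range <- c(0, 100)

# Minimal condition: the smallest range that still shows the data
minimal_range <- range(scores)

# Standardized condition: group mean +/- 1 to 2 SDs (1.5 chosen here)
k <- 1.5
standardized_range <- mean(scores) + c(-1, 1) * k * sd(scores)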
The paper reports that “participants’ sensitivity to the effect depicted in the graph was better when the y-axis range was between one to two standard deviations than with either the minimum range or the full range. In addition, bias was also smaller with the standardized axis range than the minimum or full axis ranges.”
Given this, I was wondering how ggplot2 automatically determines its y-axis scaling. I know for certain it is not the full range, but I am struggling to determine whether ggplot's default is more akin to the standardized or the minimal range. Below is an example of a ggplot plot:
library(ggplot2)

# Example data: one summary value per dose group
df <- data.frame(dose = c("D0.5", "D1", "D2"),
                 len  = c(4.2, 10, 29.5))

# Bar chart using ggplot2's default y-axis scaling
p <- ggplot(data = df, aes(x = dose, y = len)) +
  geom_bar(stat = "identity")
p
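From what I can tell, a default continuous y scale in ggplot2 pads the range of the plotted data by a small multiplicative expansion (roughly 5% on each side), and because the bars are drawn from zero, zero is always included for a bar chart. For reference, here is a hedged sketch of how each of the paper's conditions could be forced explicitly on the plot above; the 0-100 scale and the 1.5 SD multiplier are illustrative assumptions, not values from the paper:

# Full condition: show the entire 0-100 scale of the hypothetical test
p + scale_y_continuous(limits = c(0, 100))

# Standardized-style condition: zoom to mean +/- 1.5 SD of the plotted values
# (coord_cartesian zooms without dropping the bars; 1.5 is an arbitrary choice)
m <- mean(df$len)
s <- sd(df$len)
p + coord_cartesian(ylim = c(m - 1.5 * s, m + 1.5 * s))

# ggplot2's default continuous expansion, written out explicitly (~5% per side)
p + scale_y_continuous(expand = expansion(mult = 0.05))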