I’ve been reading a few algorithm papers recently, and they mostly discuss convergence rates rather than time complexity. In my education so far we’ve always talked about time complexity, since it tells you how an algorithm scales with the size (or dimensionality) of the input. For example, in this paper the convergence rate is given as O(1/t). My questions are:
- What does this mean? Does t refer to the number of iterations?
- How does this relate to time complexity, and why is time complexity not discussed in the paper? It seems like an interesting question how the algorithm scales.
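To make my question concrete, here is my current understanding as a small sketch. This is *not* the paper's algorithm; it is plain gradient descent on a least-squares problem, where the classical result says that with step size 1/L (L being the gradient's Lipschitz constant) the suboptimality satisfies f(x_t) − f* ≤ L‖x₀ − x*‖² / (2t), i.e. an O(1/t) rate in the iteration count t:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)

# Exact minimizer and optimal value, for measuring suboptimality.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
f_star = f(x_star)

L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
eta = 1.0 / L                              # standard step size
x = np.zeros(10)
C = L * np.linalg.norm(x - x_star) ** 2 / 2  # constant in the O(1/t) bound

for t in range(1, 201):
    x = x - eta * A.T @ (A @ x - b)        # one gradient step, costs O(n * d)
    # The theoretical guarantee: error after t iterations is at most C / t.
    assert f(x) - f_star <= C / t + 1e-9
```

If this reading is right, the link to time complexity would be: an O(1/t) rate means reaching accuracy ε takes O(1/ε) iterations, and multiplying by the per-iteration cost (here O(nd)) gives total work O(nd/ε). Is that the correct way to connect the two?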