I want to draw text in OpenGL using stb_truetype. I use LWJGL, but the question is about texture size in general.
To create a texture atlas I call stbtt_PackBegin (and then transfer glyph data from a TTF file). It takes a byte buffer (pixels) whose size should depend on the size of the desired final image. My assumption was that this should be the height of the font's characters multiplied by the combined width of all the characters. In the examples I have only seen arbitrary sizes like 1024×1024 or 512×512, and my question is about these values.
Before calling stbtt_PackFontRange I call stbtt_PackSetOversampling. The documentation says:
The total number of pixels required is h_oversample*v_oversample larger than the default; for example, 2×2 oversampling requires 4x the storage of 1×1.
I conclude that the buffer size should be width × h_oversample × height × v_oversample.
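To make the arithmetic concrete, here is a minimal sketch of the byte math, assuming what the stb_truetype documentation states: the packer writes a single-channel bitmap (1 byte per pixel, rows tightly packed), so the buffer passed to stbtt_PackBegin holds exactly width × height bytes. Oversampling does not change that formula; it inflates each glyph's footprint by h_oversample × v_oversample, so the same atlas simply fits fewer glyphs.

```java
public class AtlasSize {
    // stb_truetype's packer writes a single-channel (1 byte/pixel) bitmap,
    // so the buffer given to stbtt_PackBegin must hold width * height bytes.
    static int packBufferBytes(int width, int height) {
        return width * height; // stride == width, no row padding
    }

    // Oversampling multiplies the pixel area each glyph occupies,
    // e.g. 2x2 oversampling needs 4x the storage of 1x1 per glyph.
    static int pixelsPerGlyphScale(int hOversample, int vOversample) {
        return hOversample * vOversample;
    }

    public static void main(String[] args) {
        System.out.println(packBufferBytes(512, 512));  // 262144
        System.out.println(pixelsPerGlyphScale(2, 2));  // 4
    }
}
```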
In practice, a width almost half that size is still enough for correct display, so my theory about the width was not confirmed. But if I make the buffer even smaller, distortions become visible.
I also noticed that if the width is NOT divisible by 4, the texture comes out distorted as well.
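The divisible-by-4 symptom is the classic GL_UNPACK_ALIGNMENT issue rather than a buffer-size problem: by default OpenGL assumes each row of an uploaded image starts on a 4-byte boundary, so a 1-byte-per-pixel atlas whose width is not a multiple of 4 gets read with phantom padding bytes and appears sheared. A sketch of the two usual fixes (the glPixelStorei call is shown as a comment so the snippet stays runnable without LWJGL):

```java
public class Alignment {
    // OpenGL's default GL_UNPACK_ALIGNMENT is 4: the driver assumes every
    // row of the source image begins on a 4-byte boundary. With a tightly
    // packed 1-byte-per-pixel atlas, widths not divisible by 4 shear the
    // texture because the driver skips nonexistent padding bytes per row.
    //
    // Fix A (preferred): declare tightly packed rows before the upload:
    //   GL11.glPixelStorei(GL11.GL_UNPACK_ALIGNMENT, 1); // LWJGL call
    //
    // Fix B: keep the default alignment and round the atlas width up.
    static int roundUpToMultipleOf4(int width) {
        return (width + 3) & ~3; // next multiple of 4 (4 is a power of two)
    }

    public static void main(String[] args) {
        System.out.println(roundUpToMultipleOf4(510)); // 512
        System.out.println(roundUpToMultipleOf4(512)); // 512
    }
}
```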
So my question is no longer what buffer size happens to be sufficient, but how exactly to calculate how many bytes are needed for a texture holding all the characters of a font at a given pixel height.
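One way to approach this is an area estimate rather than an exact answer. The following is a heuristic sketch, not an stb_truetype API: it assumes an average glyph width of about 0.6 × pixelHeight (roughly true for Latin text, but an assumption), scales each glyph cell by the oversampling factors plus padding, and picks the smallest power-of-two square that covers the total area. For exact numbers you would sum the real advances from stbtt_GetCodepointHMetrics instead of using the 0.6 guess.

```java
public class AtlasEstimate {
    // Heuristic (assumption-laden) estimate of the square atlas side needed
    // to pack `glyphCount` glyphs rendered at `pixelHeight` with the given
    // oversampling. The 0.6 average-width factor is a guess for Latin
    // glyphs; measure real advances for an exact figure.
    static int estimateAtlasSide(int glyphCount, int pixelHeight,
                                 int hOversample, int vOversample, int padding) {
        double avgWidth = 0.6 * pixelHeight;                // assumed mean glyph width
        double cellW = avgWidth * hOversample + padding;    // oversampling widens glyphs
        double cellH = pixelHeight * vOversample + padding; // ...and makes them taller
        double area = glyphCount * cellW * cellH;
        int side = 1;
        while ((double) side * side < area) side *= 2;      // smallest power-of-two square
        return side;
    }

    public static void main(String[] args) {
        // 95 glyphs (codes 32..126), 48 px tall, 2x2 oversampling, 1 px padding
        System.out.println(estimateAtlasSide(95, 48, 2, 2, 1));
    }
}
```

Note the estimate is deliberately generous: rectangle packing never achieves 100% occupancy, so the padding term and the power-of-two rounding absorb the waste.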
Update #1
Repository: https://github.com/kepocnhh/Lwjgl.TextsLogics/tree/wip
Here you can see the code that reads data from a TTF file and creates a texture. I want to store all characters from code 32 (inclusive) to code 126 (inclusive) in one image, for a font 48 pixels high. This method describes the creation of the texture (where the pixel size must be set). This method describes how each character is drawn. The onRenderTexts method is called in a GLFW loop.