Why does loading AutoTokenizer take so much RAM?
I was measuring the RAM used by my script and was surprised to see that it takes about 300 MB, while the tokenizer file itself is only about 9 MB. Why is that?
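For reference, this is roughly how I'm measuring it (a minimal sketch: the model name is a placeholder, and I'm using psutil to read the process RSS, which may differ from your measurement tool):

```python
import os
import psutil


def rss_mb() -> float:
    # Resident set size of the current process, in MB
    return psutil.Process(os.getpid()).memory_info().rss / 1024**2


print(f"Before import:  {rss_mb():.1f} MB")

# Importing transformers itself pulls in many modules and costs RAM
from transformers import AutoTokenizer

print(f"After import:   {rss_mb():.1f} MB")

# Placeholder model name; substitute the tokenizer you actually load
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

print(f"After loading:  {rss_mb():.1f} MB")
```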