I have fine-tuned a Phi-3 model for 1 epoch using LoRA as follows:
self.model = AutoModelForCausalLM.from_pretrained(self.model_args.model_name_or_path, **kwarg)
self.model = get_peft_model(self.model, self.lora_config)
Now I want to resume from the existing LoRA weights by loading them as below:
self.model = PeftModel.from_pretrained(self.model, adapter)
self.model.is_trainable = True
This returns the following error:
optimizer = DeepSpeedZeroOptimizer_Stage3(
File "/PATH/miniconda3/envs/llms/lib/python3.10/site-packages/deepspeed/runtime/zero/stage3.py", line 149, in __init__
self.dtype = self.optimizer.param_groups[0]['params'][0].dtype
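My understanding of the failing line, sketched as a minimal stdlib mock (the class and function names below are illustrative, not the real DeepSpeed or PEFT objects): if every model parameter is frozen when the engine builds the optimizer, the trainable-parameter list handed to ZeRO Stage 3 is empty, so indexing `params[0]` raises an IndexError. I suspect this is what happens here, since setting `self.model.is_trainable = True` after loading may not actually mark the adapter parameters as trainable; PEFT's `PeftModel.from_pretrained` also accepts `is_trainable=True` directly as a keyword argument.

```python
# Mock of deepspeed stage3.py line 149 (illustrative names, not the real
# DeepSpeed classes): with all parameters frozen, the optimizer's param
# group is empty and indexing [0] raises IndexError.

class MockParam:
    def __init__(self, dtype="float16", requires_grad=True):
        self.dtype = dtype
        self.requires_grad = requires_grad

def build_param_groups(params):
    # Only trainable parameters are handed to the optimizer.
    trainable = [p for p in params if p.requires_grad]
    return [{"params": trainable}]

# All parameters frozen -> empty params list, as I suspect happens
# when the loaded adapter is not marked trainable.
frozen = [MockParam(requires_grad=False) for _ in range(4)]
groups = build_param_groups(frozen)

try:
    dtype = groups[0]["params"][0].dtype  # mirrors stage3.py line 149
except IndexError as err:
    print("IndexError:", err)
```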
I can restart LoRA fine-tuning from scratch, but resuming fails with the optimizer error above.