python · huggingface-transformers · huggingface · peft

Can I dynamically add or remove LoRA weights in the transformers library, like in diffusers?


I see that the diffusers library has a feature to dynamically add and remove LoRA weights, as described in this article: https://github.com/huggingface/blog/blob/main/lora-adapters-dynamic-loading.md, using load_lora_weights and fuse_lora_weights. I want to know if I can do something similar with LoRA for transformers too.
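For reference, a minimal sketch of the diffusers pattern the question refers to; the pipeline class and base model id are just examples, the LoRA path is a placeholder, and the fuse/unfuse methods in current diffusers are `fuse_lora`/`unfuse_lora`:

```python
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Load LoRA weights under an adapter name ("path/to/some-lora" is a placeholder)
pipe.load_lora_weights("path/to/some-lora", adapter_name="toy")

# Optionally fuse the LoRA into the base weights for faster inference,
# then unfuse/unload to restore the original base model
pipe.fuse_lora()
pipe.unfuse_lora()
pipe.unload_lora_weights()
```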


Solution

  • In PEFT, when you create or load an adapter, you give it a name.

    You can then enable the adapter(s) of your choice dynamically by name with set_adapter: https://huggingface.co/docs/peft/package_reference/lora#peft.LoraModel.set_adapter

    See an example of how to do this here: https://huggingface.co/docs/peft/en/developer_guides/lora#load-adapters (a short sketch follows below).
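A minimal sketch of this pattern with a transformers model, assuming the adapter paths (`path/to/adapter_A`, `path/to/adapter_B`) are placeholders and the base model id is just an example:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

# Load a first adapter under a name (placeholder paths)
model = PeftModel.from_pretrained(base, "path/to/adapter_A", adapter_name="adapter_A")

# Load a second adapter into the same model
model.load_adapter("path/to/adapter_B", adapter_name="adapter_B")

# Switch between adapters dynamically by name
model.set_adapter("adapter_A")
# ... run inference with adapter_A ...
model.set_adapter("adapter_B")
# ... run inference with adapter_B ...

# Temporarily disable all adapters to get the base model's behavior
with model.disable_adapter():
    pass  # base-model inference here

# Remove an adapter you no longer need
model.delete_adapter("adapter_A")
```

Recent transformers versions also expose `load_adapter`/`set_adapter` directly on models through the built-in PEFT integration, so similar switching can be done without wrapping the model in `PeftModel`.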