The Hugging Face adapter configuration file.

## Schema

`adapter_config.json` must contain the following fields:

- `r` - The rank of the LoRA update matrices.
- `target_modules` - A list of the names of the modules that LoRA is applied to.

Additional fields may be specified but are ignored.

## Example

```json
{
  "r": 4,
  "target_modules": [
    "k_proj",
    "q_proj",
    "v_proj"
  ]
}
```
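
To make the schema concrete, here is a minimal Python sketch that loads such a file and checks the two required fields. The `load_adapter_config` helper and its validation rules are illustrative assumptions, not part of any Hugging Face API.

```python
import json

# Fields that adapter_config.json must contain (per the schema above).
REQUIRED_FIELDS = {"r", "target_modules"}

def load_adapter_config(path: str) -> dict:
    """Load an adapter config and check the required fields (hypothetical helper)."""
    with open(path, encoding="utf-8") as f:
        config = json.load(f)

    # Reject configs missing either required field; extra fields are ignored.
    missing = REQUIRED_FIELDS - config.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")

    # Assumed sanity checks: r is a positive integer, target_modules is a
    # list of module-name strings.
    if not isinstance(config["r"], int) or config["r"] <= 0:
        raise ValueError("'r' must be a positive integer")
    if not (isinstance(config["target_modules"], list)
            and all(isinstance(m, str) for m in config["target_modules"])):
        raise ValueError("'target_modules' must be a list of strings")

    return config

config = load_adapter_config("adapter_config.json")
print(config["r"], config["target_modules"])
```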