Diffstat (limited to 'misc/py-peft/pkg-descr')
 misc/py-peft/pkg-descr | 14 ++++++++++++++
 1 file changed, 14 insertions(+)
diff --git a/misc/py-peft/pkg-descr b/misc/py-peft/pkg-descr
new file mode 100644
index 000000000000..c7205201b5f5
--- /dev/null
+++ b/misc/py-peft/pkg-descr
@@ -0,0 +1,14 @@
+The peft module contains state-of-the-art Parameter-Efficient Fine-Tuning
+(PEFT) methods.
+
+Fine-tuning large pretrained models is often prohibitively costly due to their
+scale. Parameter-Efficient Fine-Tuning (PEFT) methods enable efficient
+adaptation of large pretrained models to various downstream applications by only
+fine-tuning a small number of (extra) model parameters instead of all the
+model's parameters. This significantly decreases the computational and storage
+costs. Recent state-of-the-art PEFT techniques achieve performance comparable to
+fully fine-tuned models.
+
+PEFT is integrated with Transformers for easy model training and inference,
+Diffusers for conveniently managing different adapters, and Accelerate for
+distributed training and inference for very large models.
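
As an illustration of what the described workflow looks like in practice, here is a
minimal sketch of using the peft module together with Transformers. The model name
and the LoRA hyperparameters below are illustrative assumptions, not taken from this
port's description:

    # Minimal PEFT sketch. Assumptions: facebook/opt-350m as a stand-in base
    # model; LoRA rank, scaling, and target modules chosen for illustration.
    from transformers import AutoModelForCausalLM
    from peft import LoraConfig, get_peft_model

    # Load a full pretrained model, then wrap it so that only the small
    # LoRA adapter matrices are trained instead of all model parameters.
    base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")
    lora_config = LoraConfig(
        r=8,                                  # rank of the LoRA update matrices
        lora_alpha=16,                        # scaling factor for the adapters
        target_modules=["q_proj", "v_proj"],  # attention projections to adapt
        lora_dropout=0.05,
    )
    model = get_peft_model(base_model, lora_config)

    # Shows how few parameters are actually trainable, e.g.
    # "trainable params: ... || all params: ... || trainable%: ..."
    model.print_trainable_parameters()

The wrapped model can then be trained with the usual Transformers tooling; only the
adapter parameters receive gradient updates, which is where the computational and
storage savings described above come from.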