💡 ML Concept of the Day: What is Prompt-Tuning?
Last week we explored prefix-tuning as one of the techniques that can be used to optimize foundation models without the need for full fine-tuning. A very popular alternative to prefix-tuning that shares the same principles is known as prompt-tuning or p-tuning.
Prompt-tuning sits somewhere between prompt engineering and fine-tuning. Unlike prompt engineering, p-tuning doesn't require constantly refining handcrafted prompts to achieve a specific outcome. Unlike fine-tuning, p-tuning leaves the model's weights untouched and instead optimizes a small set of trainable prompt embeddings.
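
To make the idea concrete, here is a minimal PyTorch sketch. The `SoftPromptWrapper` class, the `num_virtual_tokens` parameter, and the `backbone` interface are illustrative assumptions rather than any specific library's API; the point is simply that the only trainable parameters are the soft prompt embeddings prepended to the input, while the foundation model stays frozen.

```python
import torch
import torch.nn as nn


class SoftPromptWrapper(nn.Module):
    """Hypothetical sketch of prompt-tuning: prepend trainable 'soft prompt'
    embeddings to the input while the backbone model stays frozen."""

    def __init__(self, backbone: nn.Module, num_virtual_tokens: int = 20, embed_dim: int = 768):
        super().__init__()
        self.backbone = backbone
        # Freeze every parameter of the foundation model.
        for p in self.backbone.parameters():
            p.requires_grad = False
        # The only trainable parameters: one embedding per virtual prompt token.
        self.soft_prompt = nn.Parameter(torch.randn(num_virtual_tokens, embed_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim) token embeddings of the real input.
        batch_size = input_embeds.size(0)
        prompt = self.soft_prompt.unsqueeze(0).expand(batch_size, -1, -1)
        # Concatenate the learned prompt in front of the real tokens and run the frozen model.
        return self.backbone(torch.cat([prompt, input_embeds], dim=1))
```

During training, the optimizer is given only `soft_prompt` (e.g. `torch.optim.Adam([wrapper.soft_prompt])`), so gradient updates touch a few thousand parameters instead of the billions in the frozen backbone.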