Zero-Label Language Learning
- Towards Zero-Label Language Learning
- Unsupervised Data Generation
- Evaluated on SuperGLUE
- Treat LMs as few-shot generators (rather than few-shot learners)
- Create prompts with <sample, label> pair(s)
- Ask the model to generate more for the same label
- The emphasis is on labeled-data generation (rather than on inference)
- The new idea is to generate more labeled data and then follow the conventional supervised route
- The paper supports all of the above by introducing Unsupervised Data Generation (UDG) with LMs, even for complex higher-order tasks, and empirically shows that classical fine-tuning on the generated data works better
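The prompt-construction step in the bullets above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual template: the function name `build_udg_prompt`, the sentiment-review framing, and the prompt wording are all assumptions introduced here for illustration.

```python
# Hypothetical sketch of UDG-style prompt construction: seed the prompt with
# existing <sample, label> pairs, then ask the LM to generate a new sample
# for the same label.

def build_udg_prompt(examples, target_label, k=2):
    """Build a few-shot prompt asking an LM to generate a new input
    for `target_label`, seeded with up to k matching examples."""
    # Keep only seed samples whose label matches the one we want more data for.
    seeds = [sample for sample, label in examples if label == target_label][:k]
    lines = [f"Write a {target_label} movie review."]
    for sample in seeds:
        lines.append(f"Review: {sample}")
    lines.append("Review:")  # the LM's completion becomes the new sample
    return "\n".join(lines)


examples = [
    ("A heartfelt story with superb acting.", "positive"),
    ("Dull plot and wooden performances.", "negative"),
]
prompt = build_udg_prompt(examples, "positive")
# Pair the LM's completion with "positive" to get a new synthetic
# training example, then fine-tune a classifier on the generated set.
```

The key design point from the notes: the LM is used only as a generator here; a conventional model is then fine-tuned on the synthetic pairs, so inference-time prompting is avoided entirely.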