Learning from Rules Generalizing Labeled Exemplars

Abhijeet Awasthi, Sabyasachi Ghosh, Rasna Goyal, Sunita Sarawagi

Keywords: denoising, noisy labels, weakly supervised learning, semi-supervised learning, text classification, nlp

Wed Session 1 (05:00-07:00 GMT)
Wed Session 4 (17:00-19:00 GMT)
Wednesday: Symbols and Discovery

Abstract: In many applications labeled data is not readily available and must be collected through painstaking human supervision. We propose a rule-exemplar method for collecting human supervision that combines the efficiency of rules with the quality of instance labels. The supervision is coupled such that it is both natural for humans and synergistic for learning. We propose a training algorithm that jointly denoises rules via latent coverage variables and trains the model through a soft implication loss over the coverage and label variables. The denoised rules and trained model are used jointly for inference. Empirical evaluation on five different tasks shows that (1) our algorithm is more accurate than several existing methods of learning from a mix of clean and noisy supervision, and (2) the coupled rule-exemplar supervision is effective in denoising rules.
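The abstract's soft implication loss can be illustrated with a minimal sketch. The idea: if the network predicts that a rule covers an instance (coverage probability high), then the classifier should also assign high probability to that rule's label; the loss penalizes only the event "rule fires but model disagrees". The function name and the exact `-log(1 - p_cover * (1 - p_agree))` form below are assumptions for illustration, not the paper's verbatim formulation.

```python
import math

def soft_implication_loss(p_cover, p_agree, eps=1e-8):
    """Soft relaxation of the implication (rule covers x) => (model agrees
    with the rule's label). p_cover is the predicted probability that the
    rule covers the instance; p_agree is the model's probability for the
    rule's label. Penalizes the joint event 'covers but disagrees'."""
    return -math.log(1.0 - p_cover * (1.0 - p_agree) + eps)

# A denoised rule (p_cover near 0) contributes almost no loss,
# so noisy rules can be switched off via the coverage variable.
```

Note the asymmetry: driving `p_cover` to zero is always a way to satisfy the implication, which is exactly what lets the training algorithm denoise unreliable rules while still exploiting reliable ones.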

Similar Papers

Disentangling Factors of Variations Using Few Labels
Francesco Locatello, Michael Tschannen, Stefan Bauer, Gunnar Rätsch, Bernhard Schölkopf, Olivier Bachem
Weakly Supervised Disentanglement with Guarantees
Rui Shu, Yining Chen, Abhishek Kumar, Stefano Ermon, Ben Poole
SELF: Learning to Filter Noisy Labels with Self-Ensembling
Duc Tam Nguyen, Chaithanya Kumar Mummadi, Thi Phuong Nhung Ngo, Thi Hoai Phuong Nguyen, Laura Beggel, Thomas Brox
Differentiable learning of numerical rules in knowledge graphs
Po-Wei Wang, Daria Stepanova, Csaba Domokos, J. Zico Kolter