12. Toward Syntax-Free Semantic Role Labeling with ChatGPT and GPT-4
Transformers have made more progress in the past few years than NLP made in the previous generation. Earlier NLP models had to be trained to understand a language’s basic syntax before running Semantic Role Labeling (SRL): the software relied on syntax trees, rule bases, and parsers. The performance of such systems was limited because the number of possible word combinations produces a practically infinite range of contexts.
Shi and Lin (2019) opened their paper by asking whether preliminary syntactic and lexical training could be skipped. Could a system become “syntax-free” and understand language without relying on pre-designed syntax trees? Could a BERT-based model perform SRL without going through those classical ...
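To make the question concrete, here is a minimal sketch of what a “syntax-free” approach can look like in practice: instead of building a parse tree, we hand a raw sentence to GPT-4 and ask it to return the semantic roles directly. It assumes the openai Python package is installed and an OPENAI_API_KEY environment variable is set; the sample sentence and prompt wording are illustrative, not taken from the paper:

# Syntax-free SRL sketch: no parser, no syntax tree, no rule base.
# The raw sentence goes straight to the model, which labels the roles.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

sentence = "Marta drove the car to the garage on Sunday."

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "system",
            "content": (
                "You are a semantic role labeler. For the given sentence, "
                "identify each predicate and its arguments "
                "(agent, patient, instrument, location, time)."
            ),
        },
        {"role": "user", "content": sentence},
    ],
)

print(response.choices[0].message.content)
# Typical (model-dependent) output:
# predicate: drove | agent: Marta | patient: the car |
# location: to the garage | time: on Sunday

Note that the only “training” the model receives here is the prompt itself; no syntactic preprocessing of the input takes place at any point.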