Joint learning of dependency parsing and semantic role labeling
When natural language processing tasks overlap in their linguistic input space, they can be merged into a single task. Applying machine learning algorithms to this joint task and comparing the results of joint learning with disjoint learning of the original tasks may bring to light the linguistic relatedness of the two tasks. We present a joint learning experiment with dependency parsing and semantic role labeling of Catalan and Spanish. The developed systems use local memory-based classifiers that, from the same input features, predict constraints on the syntactic and semantic dependency relations in the resulting graph. In a second, global phase, a constraint satisfaction inference procedure produces a dependency graph and semantic role label assignments for all predicates in a sentence. The comparison between joint and disjoint learning shows that dependency parsing is learned better in a disjoint setting, whereas semantic role labeling benefits from joint learning. We explain these results through an analysis of the systems' output.
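To make the local classification phase concrete, the following is a minimal sketch of memory-based classification in the style of IB1/k-NN with an overlap metric, which is the family of classifiers the abstract refers to. All feature values, labels, and training instances below are hypothetical illustrations, not the paper's actual feature set or data:

```python
from collections import Counter

def overlap(a, b):
    # Overlap metric common in memory-based learning (e.g. IB1):
    # the number of feature positions on which two instances agree.
    return sum(x == y for x, y in zip(a, b))

def mbl_classify(memory, query, k=3):
    # Rank all stored training instances by overlap with the query,
    # then take a majority vote over the labels of the k nearest.
    ranked = sorted(memory, key=lambda inst: overlap(inst[0], query), reverse=True)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical instance base: (word form, POS, head POS) -> dependency relation.
memory = [
    (("el", "det", "noun"), "spec"),
    (("la", "det", "noun"), "spec"),
    (("corre", "verb", "root"), "sentence"),
    (("perro", "noun", "verb"), "subj"),
    (("gata", "noun", "verb"), "subj"),
]

print(mbl_classify(memory, ("un", "det", "noun")))  # -> "spec"
```

In a joint setting, the same feature vector would feed classifiers for both syntactic relations and semantic role constraints, whose local predictions are then reconciled by the global constraint satisfaction step.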