Probing a Semantic Dependency Parser for Translational Relation Embeddings
Riley Capshaw, Marco Kuhlmann, and Eva Blomqvist. Probing a Semantic Dependency Parser for Translational Relation Embeddings. In Proceedings of the Workshop on Deep Learning for Knowledge Graphs (DL4KG2020) Co-located with the 17th Extended Semantic Web Conference 2020 (ESWC 2020), Heraklion, Greece – moved online, 2020.
Translational relation models are primarily applied to the task of Knowledge Graph embedding. We present a structural probe for testing whether a state-of-the-art semantic dependency parser learns contextualized word representations that fit a translational relation model. We find that the parser does not explicitly learn such a model. We do, however, find that a simple transformation of the word representations is enough to induce a TransE model with 73.45% label recall, indicating that translational relation models are at least implicitly learned by the parser. We believe that our findings can in the future be used to develop Natural Language Understanding systems that are more useful for Knowledge Graph generation and completion.
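To make the central notion concrete: a translational relation model such as TransE represents a relation r as a vector offset, so that a triple (h, r, t) is plausible when h + r ≈ t in embedding space. The sketch below is illustrative only, with hypothetical toy embeddings (it is not the probe or parser from the paper); it shows the distance-based scoring that such a model uses.

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE plausibility score: distance between h + r and t.
    Lower scores mean the triple (h, r, t) is more plausible."""
    return np.linalg.norm(h + r - t, ord=norm)

# Toy 4-dimensional embeddings (hypothetical values, for illustration only).
head = np.array([0.1, 0.4, -0.2, 0.3])
relation = np.array([0.2, -0.1, 0.5, 0.0])
tail_good = head + relation + 0.01   # nearly satisfies h + r ≈ t
tail_bad = np.array([-0.9, 0.8, 0.7, -0.6])

print(transe_score(head, relation, tail_good))  # small distance
print(transe_score(head, relation, tail_bad))   # large distance
```

A probe for such a structure would ask whether relation labels in the parser's output correspond to consistent vector offsets between the contextualized representations of the two words they connect.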