Improving Knowledge Base Construction from Robust Infobox Extraction
This is an Accepted submission for the Research space at Wikimania 2019.
Abstract
A capable, automatic Question Answering (QA) system can provide more complete and accurate answers using a comprehensive knowledge base (KB). One important approach to constructing a comprehensive knowledge base is to extract information from Wikipedia infobox tables to populate an existing KB. Despite previous successes in the Infobox Extraction (IBE) problem (e.g., DBpedia), three major challenges remain: 1) Deterministic extraction patterns used in DBpedia are vulnerable to template changes; 2) Over-trusting Wikipedia anchor links can lead to entity disambiguation errors; 3) Heuristic-based extraction of unlinkable entities yields low precision, hurting both accuracy and completeness of the final KB. This paper presents a robust approach that tackles all three challenges. We build probabilistic models to predict relations between entity mentions directly from the infobox tables in HTML. The entity mentions are linked to identifiers in an existing KB if possible. The unlinkable ones are also parsed and preserved in the final output. Training data for both the relation extraction and the entity linking models are automatically generated using distant supervision. We demonstrate the empirical effectiveness of the proposed method in both precision and recall compared to a strong IBE baseline, DBpedia, with an absolute improvement of 41.3% in average F1. We also show that our extraction makes the final KB significantly more complete, improving the completeness score of list-value relation types by 61.4%. (paper)
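To make the distant-supervision step concrete, the sketch below shows one common way such training labels can be generated: an infobox row's value mention is linked to a KB identifier, and the row is labeled with a relation whenever the linked pair matches an existing KB triple. This is a minimal illustration, not the authors' code; the toy triples, the link_to_kb helper, and the example rows are all hypothetical.

# Minimal distant-supervision sketch (hypothetical names and data).

# Toy existing KB: (subject, relation, object) triples, e.g. drawn from Wikidata.
KB_TRIPLES = {
    ("Q90", "country", "Q142"),        # Paris -- country --> France
    ("Q90", "mayor", "Q2851652"),      # Paris -- mayor --> Anne Hidalgo
}

def link_to_kb(mention):
    """Stand-in entity linker: map a surface mention to a KB identifier, or None."""
    toy_index = {"France": "Q142", "Anne Hidalgo": "Q2851652"}
    return toy_index.get(mention)

def distant_labels(page_entity, infobox_rows):
    """Label each (attribute, value-mention) row with a KB relation when the
    linked pair matches an existing triple; matches become training examples."""
    examples = []
    for attribute, value_mention in infobox_rows:
        obj = link_to_kb(value_mention)
        if obj is None:
            continue  # unlinkable mentions receive no distant label here
        for subj, relation, o in KB_TRIPLES:
            if subj == page_entity and o == obj:
                examples.append((attribute, value_mention, relation))
    return examples

# Rows scraped from the infobox of the "Paris" article (illustrative only).
rows = [("Country", "France"), ("Mayor", "Anne Hidalgo")]
print(distant_labels("Q90", rows))
# [('Country', 'France', 'country'), ('Mayor', 'Anne Hidalgo', 'mayor')]

The resulting (attribute, mention, relation) examples can then train the relation extraction model, so no hand-labeled infobox data is needed.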
Authors
Boya Peng (Sentropy Technologies), Yejin Huh (Apple Inc.), Xiao Ling (Apple Inc.), Michele Banko (Sentropy Technologies)
Relevance to Wikimedia Community
We discuss a method for extracting structured information from Wikipedia infoboxes using Wikidata as a knowledge base for distant supervision.
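Because the extraction reads rendered infobox HTML rather than wikitext templates, it is insensitive to template renames and refactors. A minimal sketch of that reading step follows; it assumes the third-party beautifulsoup4 package, and the sample HTML and infobox_rows function are illustrative, not the authors' implementation.

# Sketch: read attribute/value pairs from rendered infobox HTML (hypothetical).
from bs4 import BeautifulSoup

HTML = """
<table class="infobox">
  <tr><th>Country</th><td><a href="/wiki/France">France</a></td></tr>
  <tr><th>Mayor</th><td><a href="/wiki/Anne_Hidalgo">Anne Hidalgo</a></td></tr>
</table>
"""

def infobox_rows(html):
    """Yield (attribute, value text, anchor targets) for each header/value row."""
    soup = BeautifulSoup(html, "html.parser")
    for row in soup.select("table.infobox tr"):
        th, td = row.find("th"), row.find("td")
        if th is None or td is None:
            continue  # skip caption/image rows that lack a header-value pair
        links = [a["href"] for a in td.find_all("a", href=True)]
        yield th.get_text(" ", strip=True), td.get_text(" ", strip=True), links

for attribute, value, links in infobox_rows(HTML):
    print(attribute, "->", value, links)
# Country -> France ['/wiki/France']
# Mayor -> Anne Hidalgo ['/wiki/Anne_Hidalgo']

The anchor targets give candidate KB links for each value mention, while the raw text is kept so unlinkable mentions can still be parsed and preserved in the output.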
Session type
22-minute presentation.
Participants
# ...