Transitivity in semantic relation learning
Fallucchi, F.
2010
Abstract
Text understanding models exploit semantic networks of words as basic components. Automatically enriching and expanding these resources is thus an important challenge for NLP. Existing models for enriching semantic resources from lexical-syntactic patterns make little use of the structural properties of the target semantic relations. In this paper, we propose a novel approach that includes transitivity directly in the formulation of probabilistic models for expanding semantic resources. Experiments demonstrate that these models are an effective way of exploiting the structural properties of relations when learning semantic networks.
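To make the idea concrete, the sketch below shows one generic way transitivity can be folded into pairwise relation probabilities; it is an illustration under stated assumptions, not the paper's actual formulation. It assumes a hypothetical dictionary `prob` of scores P(a IS-A b) produced by a pattern-based classifier, and propagates evidence along chains with a max-product, Floyd-Warshall-style closure, so a pair supported only indirectly still receives a score.

```python
from itertools import product

# Hypothetical pairwise scores from a pattern-based classifier:
# prob[(a, b)] estimates P(a IS-A b). All names and values are
# illustrative, not taken from the paper.
prob = {
    ("dog", "mammal"): 0.9,
    ("mammal", "animal"): 0.8,
    ("dog", "animal"): 0.3,  # weak direct evidence
}

nodes = {n for pair in prob for n in pair}

def transitive_closure(prob, nodes):
    """Max-product closure: enforce P(a, c) >= P(a, b) * P(b, c).

    A Floyd-Warshall-style pass (intermediate node k in the outer
    loop) that propagates evidence along chains of IS-A edges.
    """
    p = dict(prob)
    for k, i, j in product(nodes, repeat=3):
        via = p.get((i, k), 0.0) * p.get((k, j), 0.0)
        if via > p.get((i, j), 0.0):
            p[(i, j)] = via
    return p

closed = transitive_closure(prob, nodes)
print(closed[("dog", "animal")])  # 0.72: lifted by dog -> mammal -> animal
```

In this toy run, the weak direct score for (dog, animal) is raised from 0.3 to 0.72 by the chain through mammal, which is the behavior a transitivity-aware model should exhibit; the paper's contribution is to build such a constraint into the probabilistic model itself rather than applying it as a post-processing step.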