Transitivity in semantic relation learning

Fallucchi, F.
2010-01-01

Abstract

Text understanding models rely on semantic networks of words as basic components. Automatically enriching and expanding these resources is thus an important challenge for NLP. Existing models for enriching semantic resources based on lexical-syntactic patterns make little use of the structural properties of the target semantic relations. In this paper, we propose a novel approach that includes transitivity directly in the formulation of probabilistic models for expanding semantic resources. Experiments demonstrate that these models are an effective way of exploiting structural properties of relations in learning semantic networks.
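To illustrate the general idea of exploiting transitivity when learning a relation such as hypernymy (is-a), the following sketch combines direct pattern-based evidence for a pair with chained two-hop evidence via a noisy-OR. This is a hypothetical toy example of the technique the abstract names, not the paper's actual probabilistic formulation; the function name, the toy scores, and the noisy-OR combination rule are all assumptions made for illustration.

```python
# Toy sketch of transitivity in a probabilistic relation model (illustrative only).
# prob[a][b] holds the direct probability that the pair (a, b) is in the relation.

def transitive_update(prob):
    """Combine direct evidence for (a, c) with two-hop chained evidence,
    reflecting that for a transitive relation, R(a, b) and R(b, c) support R(a, c).
    Evidence sources are combined with a noisy-OR (an assumption of this sketch)."""
    nodes = list(prob)
    updated = {a: dict(prob[a]) for a in nodes}
    for a in nodes:
        for c in nodes:
            if a == c:
                continue
            p_direct = prob[a].get(c, 0.0)
            # Probability that no two-hop chain a -> b -> c holds.
            p_no_chain = 1.0
            for b in nodes:
                if b in (a, c):
                    continue
                p_no_chain *= 1.0 - prob[a].get(b, 0.0) * prob[b].get(c, 0.0)
            # Noisy-OR of direct and chained evidence.
            updated[a][c] = 1.0 - (1.0 - p_direct) * p_no_chain
    return updated

# Hypothetical hypernymy scores: "dog" is-a "canine", "canine" is-a "animal".
prob = {
    "dog":    {"canine": 0.9},
    "canine": {"animal": 0.8},
    "animal": {},
}
new = transitive_update(prob)
# Chained evidence now supports "dog" is-a "animal" with score 0.9 * 0.8 = 0.72.
```

The point of the sketch is that a pair never seen together in any lexical-syntactic pattern can still receive probability mass through chains of observed pairs, which is exactly the structural property a transitivity-aware model can exploit.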
ISBN: 978-1-4244-6896-6

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14241/2187
Citations
  • Scopus: 5