Canada
29 years old, born March 11

Reader registered on March 11, 2021

Learning More Efficient Language Models by Discounting the Effect of Words in Regular Expressions
Probabilistic models are among the most basic models for learning. However, they are limited in the number of hypotheses they can represent and in the data structures they rely on. In this paper, we address these issues by modeling the probability of words in sentences as a function of word-level dependencies. We provide a non-parametric model based on the distribution over word pairs, together with a Bayesian model of the distribution parameters of words, which accounts for word-level dependencies. We also describe how to exploit the knowledge captured by our model to improve its performance. Specifically, we present a novel approach to constructing an efficient model of word-level dependency, based on conditional independence measures, for determining the probability that a sentence is written. Finally, we evaluate our model on both text-level and sentence-level benchmark datasets and show that the proposed approach improves prediction performance.
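The abstract gives no implementation details, but the word-pair model it describes resembles a bigram language model with a Bayesian prior over each conditional word distribution. The sketch below works under that assumption: a symmetric Dirichlet prior with concentration alpha yields the posterior predictive estimate P(w | prev) = (count + alpha) / (context count + alpha * V). The class name BigramLM, the parameter alpha, and the method names are illustrative, not taken from the paper.

```python
from collections import Counter, defaultdict

class BigramLM:
    """Illustrative bigram language model with a symmetric Dirichlet prior.

    P(w | prev) is the posterior predictive under a Dirichlet(alpha)
    prior on each conditional word distribution:
        (count(prev, w) + alpha) / (count(prev) + alpha * V)
    """

    def __init__(self, alpha=0.5):
        self.alpha = alpha                        # Dirichlet concentration (assumed)
        self.pair_counts = defaultdict(Counter)   # counts of (prev, word) pairs
        self.context_counts = Counter()           # counts of the preceding word
        self.vocab = set()

    def fit(self, sentences):
        # Accumulate pair and context counts over tokenized sentences.
        for sent in sentences:
            tokens = ["<s>"] + sent + ["</s>"]
            self.vocab.update(tokens)
            for prev, word in zip(tokens, tokens[1:]):
                self.pair_counts[prev][word] += 1
                self.context_counts[prev] += 1

    def cond_prob(self, word, prev):
        # Posterior predictive probability of `word` given `prev`.
        v = len(self.vocab)
        num = self.pair_counts[prev][word] + self.alpha
        den = self.context_counts[prev] + self.alpha * v
        return num / den

    def sentence_prob(self, sent):
        # Probability that the sentence is written, as a product of
        # word-level conditional probabilities.
        tokens = ["<s>"] + sent + ["</s>"]
        p = 1.0
        for prev, word in zip(tokens, tokens[1:]):
            p *= self.cond_prob(word, prev)
        return p

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
lm = BigramLM(alpha=0.5)
lm.fit(corpus)
print(lm.sentence_prob(["the", "cat", "sat"]))
```

The prior keeps every conditional probability strictly positive, so unseen word pairs are discounted rather than assigned zero probability; this is one simple reading of how the model handles word-level dependencies it has not observed.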