Please use this identifier to cite or link to this item:
http://localhost:8080/xmlui/handle/123456789/3153
Title: | Entropy regularization for topic modelling |
Authors: | Sristy, Nagesh Bhattu; Somayajulu, D. V. L. N. |
Issue Date: | 2014 |
Publisher: | ACM International Conference Proceeding Series |
Citation: | 10.1145/2662117.2662130 |
Abstract: | Supervised Latent Dirichlet based topic models are variants of Latent Dirichlet topic models with the additional capability to discriminate the samples. Light supervision strategies have recently been adopted to express rich domain knowledge in the form of constraints. The Posterior Regularization framework is developed for learning models from this weaker form of supervision by expressing a set of constraints over the family of posteriors. Modelling arbitrary problem-specific dependencies is a non-trivial task, increasing the complexity of the already hard inference problem in Latent Dirichlet based topic models. In the current work we propose a posterior regularization method for topic models that captures a wide variety of auxiliary supervision. This approach simplifies the computational challenges posed by additional compound terms. We demonstrate the use of this framework in improving the utility of topic models in the presence of entropy constraints, and we experiment with real-world datasets to test the above-mentioned techniques. |
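The abstract describes regularizing a topic-model posterior with entropy constraints. As a minimal illustrative sketch (not the paper's actual method, and all names and values below are hypothetical), the simplest entropy-regularized objective, min_q KL(q || p) − λH(q), has a closed-form "tempering" solution q ∝ p^(1/(1+λ)), which flattens the posterior toward uniform as λ grows:

```python
import numpy as np

def entropy_regularized_posterior(p, lam):
    """Closed-form minimizer of KL(q || p) - lam * H(q) over the simplex:
    tempering q ∝ p^(1 / (1 + lam)). Larger lam raises the entropy of q."""
    q = p ** (1.0 / (1.0 + lam))
    return q / q.sum()

def entropy(d):
    """Shannon entropy in nats."""
    return float(-(d * np.log(d)).sum())

# Hypothetical per-document topic posterior from an unregularized model.
p = np.array([0.7, 0.2, 0.1])

q = entropy_regularized_posterior(p, lam=1.0)
print(q.sum())                    # still a valid distribution
print(entropy(q) > entropy(p))    # regularization increased the entropy
```

The paper's framework imposes constraints over the whole family of posteriors inside inference; this standalone projection is only meant to show the direction in which an entropy constraint pushes a single topic distribution.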
Description: | NITW |
URI: | http://localhost:8080/xmlui/handle/123456789/3153 |
Appears in Collections: | Computer Science & Engineering |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
2662117.2662130.pdf | | 452.64 kB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.