Please use this identifier to cite or link to this item: http://localhost:8080/xmlui/handle/123456789/3153
Full metadata record
DC Field | Value | Language
dc.contributor.author | Sristy, Nagesh Bhattu | -
dc.contributor.author | Somayajulu, D. V. L. N. | -
dc.date.accessioned | 2025-02-06T06:06:47Z | -
dc.date.available | 2025-02-06T06:06:47Z | -
dc.date.issued | 2014 | -
dc.identifier.citation | 10.1145/2662117.2662130 | en_US
dc.identifier.uri | http://localhost:8080/xmlui/handle/123456789/3153 | -
dc.description | NITW | en_US
dc.description.abstract | Supervised Latent Dirichlet-based topic models are variants of Latent Dirichlet topic models with the additional capability to discriminate among samples. Light supervision strategies have recently been adopted to express rich domain knowledge in the form of constraints. The Posterior Regularization framework was developed for learning models from this weaker form of supervision, expressed as a set of constraints over the family of posteriors. Modelling arbitrary problem-specific dependencies is a non-trivial task that increases the complexity of the already hard inference problem in Latent Dirichlet-based topic models. In the current work we propose a posterior regularization method for topic models that captures a wide variety of auxiliary supervision. This approach simplifies the computational challenges posed by the additional compound terms. We demonstrate the use of this framework in improving the utility of topic models in the presence of entropy constraints, and we experiment with real-world datasets to test the above techniques. | en_US
dc.language.iso | en | en_US
dc.publisher | ACM International Conference Proceeding Series | en_US
dc.title | Entropy regularization for topic modelling | en_US
dc.type | Other | en_US
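
Note on the abstract above: it builds on the Posterior Regularization (PR) framework of Ganchev et al., in which learning maximizes the data likelihood while projecting the model posterior onto a constraint set. The following is a minimal reading-aid sketch of that standard PR objective; the entropy-constraint form shown is an illustrative assumption, not the paper's exact formulation (the symbols q, \phi, and b follow standard PR notation and do not appear in this record).

    \max_{\theta} \; \log p(\mathbf{w} \mid \theta)
      \;-\; \min_{q \in \mathcal{Q}} \mathrm{KL}\big( q(\mathbf{z}) \,\|\, p(\mathbf{z} \mid \mathbf{w}, \theta) \big),
    \qquad
    \mathcal{Q} = \big\{\, q \;:\; \mathbb{E}_q[\phi(\mathbf{w}, \mathbf{z})] \le \mathbf{b} \,\big\}.

    % One illustrative way to impose an entropy constraint: bound the
    % expected entropy of each document d's topic assignments under q.
    \mathbb{E}_q\big[ -\log q(z_d) \big] \le b_d .

In standard PR, inference alternates between a KL projection of the posterior onto \mathcal{Q} (a convex problem, typically solved in its dual) and an ordinary M-step. An entropy bound of this form is not a linear expectation of a fixed feature, which appears to be the kind of compound term the abstract says this work simplifies.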
Appears in Collections: Computer Science & Engineering

Files in This Item:
File | Description | Size | Format
2662117.2662130.pdf | | 452.64 kB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.