Toward Interpretable Semantic Textual Similarity via Optimal Transport-based Contrastive Sentence Learning

Abstract
Recently, finetuning a pretrained language model to capture the similarity between sentence embeddings has shown the state-of-the-art performance on the semantic textual similarity (STS) task. However, the absence of an interpretation method for the sentence similarity makes it difficult to explain the model output. In this work, we explicitly describe the sentence distance as the weighted sum of contextualized token distances on the basis of a transportation problem, and then present the optimal transport-based distance measure, named RCMD; it identifies and leverages semantically-aligned token pairs. In the end, we propose CLRCMD, a contrastive learning framework that optimizes RCMD of sentence pairs, which enhances the quality of sentence similarity and their interpretation. Extensive experiments demonstrate that our learning framework outperforms other baselines on both STS and interpretable-STS benchmarks, indicating that it computes effective sentence similarity and also provides interpretation consistent with human judgement.
Anthology ID: 2022.acl-long.412
Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: May
Year: 2022
Address: Dublin, Ireland
Editors: Smaranda Muresan, Aline Villavicencio
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 5969–5979
DOI: 10.18653/v1/2022.acl-long.412
Bibkey: lee-etal-2022-toward
Code: sh0416/clrcmd
Data: MultiNLI

Cite (ACL): Seonghyeon Lee, Dongha Lee, Seongbo Jang, and Hwanjo Yu. 2022. Toward Interpretable Semantic Textual Similarity via Optimal Transport-based Contrastive Sentence Learning. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 5969–5979, Dublin, Ireland. Association for Computational Linguistics.

Cite (Informal): Toward Interpretable Semantic Textual Similarity via Optimal Transport-based Contrastive Sentence Learning (Lee et al., ACL 2022)

BibTeX:
@inproceedings{lee-etal-2022-toward,
    title = "Toward Interpretable Semantic Textual Similarity via Optimal Transport-based Contrastive Sentence Learning",
    author = "Lee, Seonghyeon and Lee, Dongha and Jang, Seongbo and Yu, Hwanjo",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = may,
    year = "2022",
    address = "Dublin, Ireland",
    publisher = "Association for Computational Linguistics",
    pages = "5969--5979",
    doi = "10.18653/v1/2022.acl-long.412",
}
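The abstract frames sentence distance as a weighted sum of contextualized token distances derived from a transportation problem. As a rough illustration of that framing (not the authors' implementation — see the sh0416/clrcmd repository for that), the following minimal sketch computes a relaxed transport distance in which each token sends its uniform mass to the closest token in the other sentence, averaged over both directions. The toy 2-d vectors stand in for contextualized token embeddings from a pretrained language model, and the function names are illustrative assumptions.

```python
import math

def cosine_distance(u, v):
    """1 minus the cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

def relaxed_transport_distance(tokens_a, tokens_b):
    """Relaxed solution of the transportation problem between two token sets:
    each token carries uniform mass 1/n and ships all of it to its nearest
    token on the other side; the two directions are averaged. The chosen
    token pairs are exactly the 'semantically-aligned' pairs one could
    inspect to interpret the sentence distance."""
    def one_direction(src, tgt):
        return sum(min(cosine_distance(s, t) for t in tgt) for s in src) / len(src)
    return 0.5 * (one_direction(tokens_a, tokens_b) + one_direction(tokens_b, tokens_a))

# Toy "token embeddings" for two sentences (placeholders for contextualized vectors).
sent_a = [[1.0, 0.0], [0.0, 1.0]]
sent_b = [[1.0, 1.0]]

print(relaxed_transport_distance(sent_a, sent_a))  # identical sentences -> 0.0
print(relaxed_transport_distance(sent_a, sent_b))  # differing sentences -> positive
```

Because each source token is matched to a single nearest target token, the per-pair distances can be read off directly, which is the intuition behind interpreting the overall similarity score.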