Biterm topic model of social network users’ sentiment by integrating word co-occurrence
Main Authors:
Format: Article
Language: zho
Published: Beijing Xintong Media Co., Ltd, 2020-11-01
Series: Dianxin kexue
Subjects:
Online Access: http://www.telecomsci.com/zh/article/doi/10.11959/j.issn.1000-0801.2020302/
Summary: With the growing number of social network users in recent years, text-based user sentiment analysis has attracted wide attention and application. However, data sparsity often limits the accuracy and speed of emotion recognition methods. A user sentiment Biterm topic model (US-BTM) was proposed to discover user preferences and emotional tendencies from the text associated with specific places, making effective use of biterms for topic modeling. Posts were aggregated by user into pseudo-documents, and word pairs (biterms) were generated over the whole corpus to alleviate the data sparsity and short-text problems (a minimal sketch of this preprocessing follows the record fields below). Topics were then learned through the word co-occurrence model, so that they could be inferred with rich corpus-level information; by analyzing the biterm set in the review corpus of a specific scene together with the sentiment of the corresponding topics, the user's interest, preference, and emotion toward that scene could be predicted accurately. Experimental results show that the proposed method accurately captures users' emotional tendencies and correctly reveals their preferences, and can be widely applied to social network content description, recommendation, user interest profiling, semantic analysis, and other fields.
ISSN: 1000-0801
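The abstract describes two concrete preprocessing steps before topic inference: aggregating each user's short posts into a pseudo-document, and enumerating word pairs (biterms) over the corpus. The Python sketch below illustrates only those two steps on a toy, made-up corpus; the post data, function names, and window size are assumptions for illustration, and the US-BTM topic inference itself (e.g., Gibbs sampling over biterms) is not shown.

```python
from collections import defaultdict

# Toy, made-up (user_id, short post) pairs for one venue/scene -- not real data.
posts = [
    ("u1", "coffee here is great"),
    ("u1", "great latte and quiet seats"),
    ("u2", "music too loud service slow"),
    ("u2", "slow service but nice music"),
]

def aggregate_by_user(posts):
    """Merge each user's short posts into one pseudo-document,
    the aggregation strategy the abstract describes for easing
    short-text sparsity."""
    docs = defaultdict(list)
    for user, text in posts:
        docs[user].extend(text.lower().split())
    return docs

def extract_biterms(tokens, window=15):
    """Enumerate unordered word pairs (biterms) that co-occur within
    a fixed window, the standard biterm preprocessing step."""
    biterms = []
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + window, len(tokens))):
            biterms.append(tuple(sorted((tokens[i], tokens[j]))))
    return biterms

pseudo_docs = aggregate_by_user(posts)
corpus_biterms = {u: extract_biterms(toks) for u, toks in pseudo_docs.items()}
for user, bs in corpus_biterms.items():
    print(user, len(bs), "biterms, e.g.", bs[:3])
```

Aggregating by user before extracting biterms is what gives each pseudo-document enough co-occurring word pairs for topic inference despite the shortness of individual posts.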