
Today, with millions of news pieces published by countless sources, most readers access news online. News recommender systems (NRS) help readers find relevant, appropriate content by suggesting articles likely to interest them and filtering out the rest, which in turn alleviates the problem of information overload. A recommender system selects news from a large pool of articles and filters it according to the reader's interests, learning from users' past behavior.

Recommendations that emphasize users' interests too heavily, however, can produce filter bubbles. A filter bubble arises when recommendations based on signals such as search logs and access logs surface only the information a user wants to see, isolating the user from information that conflicts with their ideas and enclosing their beliefs and values like a "bubble." Pariser coined the term in 2011, and it has since generated much discussion and become one of the most crucial issues in news recommendation.

Although many approaches to news recommendation have been proposed, deep learning-based models have recently been shown to perform very well. Many current deep learning techniques for news recommendation are attention-based: they learn user and news representations (vectors) from past click logs and predict click probabilities for unseen items. More recently, user and content representations have improved further by adopting pre-trained language models such as BERT. While topics like increasing diversity in news recommendation have been widely discussed in recent years, no proposal has yet addressed diversity of political opinions, particularly differences in political stance across topics.
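The attention-based pipeline described above can be sketched in a few lines. This is a minimal illustration, not the paper's model: the embeddings are random toy vectors, and the learnable attention query and dot-product click score are common choices in attention-based NRS, assumed here for concreteness.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def user_representation(clicked_news, query):
    """Attention-pool the vectors of previously clicked news into one user vector.

    clicked_news: (n_clicks, d) array of news embeddings
    query: (d,) attention query vector (learned in a real model)
    """
    scores = clicked_news @ query        # relevance score for each clicked item
    weights = softmax(scores)            # attention distribution over clicks
    return weights @ clicked_news        # weighted sum -> user representation

def click_score(user_vec, candidate_vec):
    # Predicted click score as a dot product of user and candidate vectors.
    return float(user_vec @ candidate_vec)

rng = np.random.default_rng(0)
clicks = rng.normal(size=(5, 8))   # five previously read articles, toy embeddings
query = rng.normal(size=8)
cand = rng.normal(size=8)

u = user_representation(clicks, query)
print(round(click_score(u, cand), 3))
```

In a trained system, the embeddings would come from a news encoder (e.g., BERT) and the query vector would be learned end-to-end from click logs.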

The major problem with news recommender systems is that they can bias readers' political views based on their past activity. Bias toward particular political stances in recommendation results can deepen the political divide between liberals and conservatives. A group of researchers therefore published a paper on reducing cross-topic political homogenization in content-based news recommendations. To combat filter bubbles in news recommendation, the work proposes a novel strategy focused on political ideology. Recommendation results based on user interests can be biased for various reasons, including emotional polarity and article content; this study pays special attention to liberal or conservative bias and political stances.

The work proposes two attention-based deep learning models. The first adds an objective function that penalizes terms describing political ideology, whether liberal or conservative, which the authors collected independently, so that those terms are less likely to influence the prediction. The second strategy gives topic-specific words more weight. A technique combining both approaches was also tested.
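The two strategies can be illustrated with a toy example. This is a hedged sketch of the general idea, not the paper's exact formulation: the token scores, the boost factor for topic words, and the penalty strength `lam` are all hypothetical, and the penalty is modeled simply as the attention mass falling on curated ideology words, added to a binary cross-entropy click loss.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def bce(y_true, y_pred, eps=1e-9):
    # Binary cross-entropy for the like / not-like click label.
    return -(y_true * np.log(y_pred + eps) + (1 - y_true) * np.log(1 - y_pred + eps))

# Toy article: raw token attention scores and word-type masks.
raw_scores = np.array([1.2, 0.3, 2.0, 0.5, 1.1])
ideology_mask = np.array([0, 0, 1, 0, 1])   # tokens on the curated liberal/conservative list
topic_mask = np.array([1, 0, 0, 1, 0])      # tokens specific to the article's topic

# Strategy 2: boost topic-specific words before the softmax (0.5 is an assumed factor).
boosted = raw_scores + 0.5 * topic_mask
attn = softmax(boosted)

# Strategy 1: measure attention mass landing on ideology words and penalize it.
ideology_attn = float(attn @ ideology_mask)

y_true, y_pred = 1.0, 0.8   # toy click label and model prediction
lam = 0.1                   # penalty strength (hypothetical hyperparameter)
loss = float(bce(y_true, y_pred) + lam * ideology_attn)
print(round(loss, 4))
```

Minimizing such a combined loss nudges the model to rely on topic words rather than ideology words when predicting clicks, which is the intuition behind both strategies.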

In the proposed approach, the team framed news recommendation as a simple binary classification of whether the reader likes an article or not. Users fall into two categories: those who prefer conservative articles (category 1) and those who prefer liberal articles (category 2). The experiment used a dataset of 900,000 news articles from 41 news websites, obtained from Liu et al. Each article carries one of five political-stance labels, -2, -1, 0, 1, 2, on a scale from -2 (most liberal) to +2 (most conservative). This study uses 100,000 samples from the collection. Because these articles are labeled by political stance rather than by topic, unsupervised clustering was used to extract topics.
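The unsupervised topic extraction step can be sketched with a minimal k-means clustering of article vectors. The paper does not specify the clustering algorithm or features used, so this is an assumed illustration: two well-separated random blobs stand in for articles on two topics, and a small NumPy k-means assigns each article a topic label.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: assign each article vector to the nearest centroid."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Squared distances from every point to every centroid.
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(1)
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(0)
    return labels

rng = np.random.default_rng(1)
# Toy "article vectors": two tight blobs standing in for two distinct topics.
X = np.vstack([rng.normal(0, 0.1, (10, 4)), rng.normal(3, 0.1, (10, 4))])
topics = kmeans(X, k=2)
print(topics)
```

In practice the inputs would be text features such as TF-IDF or BERT embeddings of the 100,000 articles, and the resulting cluster labels would serve as the per-article topics.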

The team evaluated the proposed method on a dataset of users who held opposing political views on two topics and found that it outperformed the baseline models, STN and STAN. Filter bubbles remain a significant problem in the realm of recommendation systems, and real-world recommendation systems that account for this range of political viewpoints are expected to emerge in the future.


Check out the Paper and Reference Article. All credit for this research goes to the researchers on this project. Also, don't forget to join our 13k+ ML SubReddit, Discord Channel, and Email Newsletter, where we share the latest AI research news, cool AI projects, and more.


Niharika is a technical consulting intern at Marktechpost. She is a third-year undergraduate, currently pursuing her B.Tech at the Indian Institute of Technology (IIT), Kharagpur. She is a highly enthusiastic individual with a keen interest in machine learning, data science, and AI, and an avid reader of the latest developments in these fields.

