About: http://data.cimple.eu/news-article/af47e6786cb3697d0c41397baa898da28b9b411401e4a640db7bc489

An Entity of Type: schema:NewsArticle, within Data Space: data.cimple.eu, associated with source document(s)

Attributes / Values
rdf:type
schema:articleBody
  • Facebook on Wednesday defended itself against a report that it shelved internal research indicating that it was dividing people instead of bringing them together.

The social media network's algorithms are aimed at getting users to spend more time on the site. But they "exploit the human brain's attraction to divisiveness," a slide from a 2018 presentation by a Facebook research team stated, according to the report in the Wall Street Journal. It warned that, if left unchecked, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."

Facebook chief Mark Zuckerberg and other executives sidelined the research, however, based on concerns it was too paternalistic or would result in product changes that would rankle politically conservative users, the Journal reported.

The company's integrity vice president, Guy Rosen, slammed the Journal story, saying the newspaper "willfully ignored critical facts that undermined its narrative". "The piece uses a couple of isolated initiatives we decided against as evidence that we don't care about the underlying issues and it ignored the significant efforts we did make," Rosen said in an online post. "As a result, readers were left with the impression we are ignoring an issue that in fact we have invested heavily in."

The Journal report also cited a 2016 study at Facebook which showed that, among German political groups, "64% of all extremist group joins are due to our recommendation tools." "Our recommendation systems grow the problem," the report said.

For years Facebook has faced criticism for allowing hatred to flourish on the network globally, with posts stoking divisions during the coronavirus pandemic being only the most recent example. One of the most notorious examples is in Myanmar, where the tech giant has been accused of being slow to respond to abusive posts portraying the country's Rohingya Muslims in sub-human terms, helping to drum up support for a military crackdown that forced more than 720,000 of the stateless minority to flee the country in 2017.

Rosen did not deny the existence of the study, but pointed out moves Facebook has made since 2016 to fight misinformation, harassment, threats and other abusive behavior. "We've taken a number of important steps to reduce the amount of content that could drive polarization on our platform, sometimes at the expense of revenues," Rosen said. "This job won't ever be complete because at the end of the day, online discourse is an extension of society and ours is highly polarized."

gc/st
schema:headline
  • Facebook denies sidelining research on site's 'divisiveness'
schema:mentions
schema:author
schema:datePublished
http://data.cimple...sPoliticalLeaning
http://data.cimple...logy#hasSentiment
http://data.cimple...readability_score
http://data.cimple...tology#hasEmotion
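The attribute/value listing above describes a single RDF entity typed as schema:NewsArticle. A minimal sketch of how that entity could be expressed as a JSON-LD document follows, using only the type, identifier, and headline shown on this page; the `@context` choice and field layout are assumptions, not the server's actual serialization (which the format links on such pages usually expose as ld+json):

```python
import json

# Hypothetical JSON-LD rendering of the entity described above.
# Only values present on the page are used; everything else is omitted.
doc = {
    "@context": "https://schema.org",  # assumed context, not confirmed by the source
    "@id": (
        "http://data.cimple.eu/news-article/"
        "af47e6786cb3697d0c41397baa898da28b9b411401e4a640db7bc489"
    ),
    "@type": "NewsArticle",
    "headline": "Facebook denies sidelining research on site's 'divisiveness'",
}

# Serialize with stable key order for readability.
print(json.dumps(doc, indent=2, sort_keys=True))
```

Properties listed without values on this page (schema:mentions, schema:author, schema:datePublished) are left out of the sketch rather than guessed.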