About: http://data.cimple.eu/news-article/1f81c36540b490f30f2ae5533ab6e1ef9ec0a5133e55b10ef2bec7cd

An Entity of Type: schema:NewsArticle, within Data Space: data.cimple.eu, associated with source document(s)

Attributes / Values
rdf:type
  • schema:NewsArticle
schema:articleBody
  • Countries must do more to combat racial profiling, UN rights experts said Thursday, warning that artificial intelligence programmes like facial recognition and predictive policing risked reinforcing the harmful practice.

    Racial profiling is not new, but the technologies once seen as tools for bringing more objectivity and fairness to policing appear in many places to be making the problem worse.

    "There is a great risk that (AI technologies will) reproduce and reinforce biases and aggravate or lead to discriminatory practices," Jamaican human rights expert Verene Shepherd told AFP.

    She is one of the 18 independent experts who make up the UN Committee on the Elimination of Racial Discrimination (CERD), which on Thursday published guidance on how countries worldwide should work to end racial profiling by law enforcement.

    The committee, which monitors compliance by the 182 signatory countries to the International Convention on the Elimination of All Forms of Racial Discrimination, raised particular concern over the use of AI algorithms for so-called "predictive policing" and "risk assessment".

    The systems have been touted as a way to make better use of limited police budgets, but research suggests they can increase deployments to communities which have already been identified, rightly or wrongly, as high-crime zones.

    "Historical arrest data about a neighbourhood may reflect racially biased policing practices," Shepherd warned. "Such data will deepen the risk of over-policing in the same neighbourhood, which in turn may lead to more arrests, creating a dangerous feedback loop."

    When artificial intelligence and algorithms use biased historical data, their profiling predictions will reflect that.

    "Bad data in, bad results out," Shepherd said. "We are concerned about what goes into making those assumptions and those predictions."

    The CERD recommendations also take issue with the growing use of facial recognition and surveillance technologies in policing.

    Shepherd said the committee had received a number of complaints about misidentification by such technologies, sometimes with dire consequences, but did not provide specific examples.

    The issue came to the forefront with the wrongful arrest in Detroit earlier this year of an African American man, Robert Williams, based on a flawed algorithm which identified him as a robbery suspect.

    Various studies show facial recognition systems developed in Western countries are far less accurate at distinguishing darker-skinned faces, perhaps because they rely on databases containing more white, male faces.

    "We have had complaints of such misidentification because of where the technologies are coming from, who is making them, and what samples they have in their system," Shepherd said. "It is a real concern."

    CERD is calling for countries to regulate private companies that develop, sell or operate algorithmic profiling systems for law enforcement.

    Countries have a responsibility to ensure that such systems comply with international human rights law, it said, stressing the importance of transparency in design and application.

    The committee insisted the public should be informed when such systems are being used and told how they work, what data sets are being used and what safeguards are in place to prevent rights abuses.

    The recommendations meanwhile go beyond the impact of new technologies, urging countries to introduce laws against all forms of racial discrimination by law enforcement.

    "Racial profiling precedes these technologies," Shepherd said.

    She said 2020 -- a year marked by surging racial tensions in many parts of the world -- was a good time to present the new guidelines. The committee, she said, "hopes that the intensification and globalisation of Black Lives Matter ... and other campaigns calling for attention to discrimination against certain vulnerable groups will help (underline) the importance of the recommendations."
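The "dangerous feedback loop" Shepherd describes above (arrest data drives patrol deployment, and patrol deployment drives arrest data) can be made concrete with a toy simulation. The sketch below is illustrative only: it assumes a deliberately simplified model with two districts that have identical true offence rates, patrols allocated in proportion to recorded arrests, and offences recorded only where patrols are present. It is not a description of any real predictive-policing system, and all names and numbers in it are hypothetical.

```python
# Toy illustration of the "bad data in, bad results out" feedback loop:
# two districts with the SAME true offence rate, but district A starts
# with an inflated arrest history. Patrols follow past arrests, and
# recorded arrests follow patrols, so the biased record sustains itself.
import random

random.seed(42)

TRUE_OFFENCE_RATE = 0.1               # identical in both districts
arrest_history = {"A": 60, "B": 40}   # biased starting data: A over-policed
TOTAL_PATROLS = 100

for year in range(1, 11):
    total_arrests = sum(arrest_history.values())
    for district, past in arrest_history.items():
        # "Predictive" allocation: patrols proportional to recorded arrests.
        patrols = round(TOTAL_PATROLS * past / total_arrests)
        # Offences are only recorded where patrols are deployed, so the
        # arrest record scales with patrol presence, not with actual crime.
        new_arrests = sum(
            1 for _ in range(patrols) if random.random() < TRUE_OFFENCE_RATE
        )
        arrest_history[district] += new_arrests
    share_a = arrest_history["A"] / sum(arrest_history.values())
    print(f"year {year}: district A share of arrest record = {share_a:.2f}")
```

Because arrests are recorded where patrols are sent, district A's inflated share of the record persists year after year even though the true offence rates are equal: the data keeps "confirming" the bias it was built on, which is the loop the committee warns about.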
schema:headline
  • UN experts sound alarm over AI-enhanced racial profiling
schema:mentions
schema:author
schema:datePublished
http://data.cimple...sPoliticalLeaning
http://data.cimple...logy#hasSentiment
http://data.cimple...readability_score
http://data.cimple...tology#hasEmotion
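Since this page describes an RDF entity, the attribute-value pairs above can also be read programmatically. The following is a minimal sketch, assuming the data.cimple.eu permalink answers content negotiation with an RDF serialization that rdflib understands, and that the schema: prefix binds to schema.org; both are assumptions based on the page layout, not documented behaviour of the endpoint.

```python
# Minimal sketch: load the entity's RDF description and read the
# schema.org properties listed above.
from rdflib import Graph, Namespace, URIRef

SCHEMA = Namespace("http://schema.org/")  # assumed prefix binding
entity = URIRef(
    "http://data.cimple.eu/news-article/"
    "1f81c36540b490f30f2ae5533ab6e1ef9ec0a5133e55b10ef2bec7cd"
)

g = Graph()
# rdflib sends RDF Accept headers and auto-detects the returned format.
g.parse(str(entity))

print(g.value(entity, SCHEMA.headline))        # the headline listed above
print(g.value(entity, SCHEMA.datePublished))
for mention in g.objects(entity, SCHEMA.mentions):
    print("mentions:", mention)
```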
Faceted Search & Find service v1.16.115 as of Oct 09 2023


Alternative Linked Data Documents: ODE     Content Formats:   [cxml] [csv]     RDF   [text] [turtle] [ld+json] [rdf+json] [rdf+xml]     ODATA   [atom+xml] [odata+json]     Microdata   [microdata+json] [html]    About   
This material is Open Knowledge   W3C Semantic Web Technology [RDF Data] Valid XHTML + RDFa
OpenLink Virtuoso version 07.20.3238 as of Jul 16 2024, on Linux (x86_64-pc-linux-musl), Single-Server Edition (126 GB total memory, 3 GB memory in use)
Data on this page belongs to its respective rights holders.
Virtuoso Faceted Browser Copyright © 2009-2025 OpenLink Software