About: http://data.cimple.eu/claim-review/2d57530a6d25942dd442cde4bce17c21291528de9b365fe70870f9ed

An Entity of Type: schema:ClaimReview, within Data Space: data.cimple.eu, associated with source document(s)

Attributes / Values
rdf:type
http://data.cimple...lizedReviewRating
schema:url
schema:text
  • What was claimed: An image shows the funeral of Iranian schoolchildren after they were killed in a missile strike.
  • Our verdict: This image is fake.

An image showing people leaning over body bags has been widely shared online, including by the Your Party MP Zarah Sultana, with claims that it shows parents in Iran burying children killed in a missile strike. But while this strike on a school did reportedly happen, the image itself is fake.

Iranian authorities said 168 people were killed in an attack on a school on 28 February, during the first day of US-Israeli attacks on the country. Independent media outlets have not been able to verify details of the incident, but satellite imagery analysis shows multiple strikes and burn marks around a school in the city of Minab.

Iranian state media has released images it says show graves being prepared for victims of the reported missile strike, but these do not match the image being widely shared online, including by Ms Sultana.

Dr Siwei Lyu, an expert in digital media forensics at the University at Buffalo, The State University of New York, told Full Fact his analysis showed the image was AI-generated. The account behind the earliest example we could find online said they “shared the AI-generated photo symbolically to reflect the scale of the tragedy”. We have contacted Ms Sultana for comment.

Dr Lyu noted that several of the burial shrouds “display identical facial portraits, suggesting a copied digital pattern rather than unique individuals”. He also spotted several portraits in the middle of the picture that contain only “blurred, featureless smears instead of discernible human faces”, which he says are “a hallmark of diffusion-model generation where the model fails to render coherent facial detail at mid-range depth”.

The number of portraits placed on each body also appears to increase unnaturally as the scene recedes into the background, which Dr Lyu says is “a procedural generation artefact with no real-world physical explanation”. People in the background seem to blend into each other, and some have “warped, nonsensical limb structures”. “These distortions are a well-documented failure mode of current diffusion models when rendering high-density crowd scenes with complex depth,” Dr Lyu said.

Other fact checkers have also concluded that the image is fake and was made with AI. You can read more of our fact checks about the ongoing conflict in the Middle East here.

During developing news events, it’s important to rely on sources that are trustworthy and verifiable before sharing content you see on social media. Our toolkit and guides on identifying AI-generated content have tips on how to do this.

Full Fact fights for good, reliable information in the media, online, and in politics.
schema:reviewRating
schema:author
schema:datePublished
schema:inLanguage
  • English
schema:itemReviewed
Faceted Search & Find service v1.16.123 as of May 22 2025


This material is Open Knowledge   W3C Semantic Web Technology [RDF Data]
OpenLink Virtuoso version 07.20.3241 as of May 22 2025, on Linux (x86_64-pc-linux-musl), Single-Server Edition (126 GB total memory, 10 GB memory in use)
Data on this page belongs to its respective rights holders.
Virtuoso Faceted Browser Copyright © 2009-2026 OpenLink Software